DBTA E-EDITION
September 2012
Trends and Applications
Are today's data systems, many of which were designed and built for the needs of the past decade, up to the task of moving information to end users at the moment they need it? And is this information timely enough? In many cases, a lot of work still needs to be done before real-time information, drawn from multiple sources, becomes a reality. A new survey of 338 data managers and professionals who are subscribers to Database Trends and Applications reveals that real-time data access is still a pipe dream for at least half of the companies represented in the survey. The survey, conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Attunity in March 2012, finds that close to half of the respondents, 48%, report that relevant data within their organizations still takes 24 hours or longer to reach decision makers. This suggests that much data is still batch-loaded overnight.
In recent years, the networks of developers, integrators, consultants, and manufacturers committed to supporting database systems have morphed from one-on-one partnerships into huge ecosystems whose members are interdependent and subject to the crosswinds of trends and shifts that are reshaping their networks. Nowhere is this more apparent than in the huge ecosystem that has developed around Oracle. With Oracle's never-ending string of acquisitions, new functionality, and widespread adoption by enterprises, trends that shape this ecosystem are certain to have far-reaching effects on the rest of the IT world. Concerns that percolate through the ecosystem both reflect and influence broad business concerns. New paradigms, from cloud computing to big data to competing on analytics, are taking root within the Oracle ecosystem long before they do anywhere else.
Address data quality is at the heart of all business operations. Inaccurate addresses can cost businesses anywhere from a few cents to many dollars per customer interaction. Undelivered products or information can result in dissatisfied or even lost customers, a cost that far exceeds the fee to implement an automated address check. Here is a list of the top 10 tips for selecting the right address data quality provider.
Columns - Applications Insight
The first computer program I ever wrote (in 1979, if you must know) was in the statistical package SPSS (Statistical Package for the Social Sciences), and the second computer platform I used was SAS (Statistical Analysis System). Both of these systems are still around today—SPSS was acquired by IBM as part of its BI portfolio, and SAS is now the world's largest privately held software company. The longevity of these platforms—they have essentially outlived almost all contemporary software packages—speaks to the perennial importance of data analysis to computing.
Columns - Database Elaborations
It seems easy to fall into a state where projects and activities assume such a soft focus that routine takes control: one simply performs the necessary tasks automatically, no questions are raised about what is moving through the work-life production line, and everyone is essentially asleep at the switch. Certainly, we may keep one eye open to ensure that, within a broad set of parameters, all is well, but as long as events are basically coloring inside the lines, we continue letting things just move along. In this semi-somnambulant state we can easily add columns to tables, or even add new entities and tables, or triggers and procedures to our databases, and then, at some point down the road, have someone turn to us and ask, "Why this?" or, "What does this really mean?" And at that point, we surprise ourselves with the discovery that the only answer we have is that someone else told us it was what we needed, but we do not really understand why it was needed.
Columns - DBA Corner
If you've worked with relational database systems for any length of time, you've probably participated in a discussion (argument?) about the topic of this month's column, surrogate keys. A great debate rages within the realm of database developers about the use of "synthetic" keys. And if you've ever Googled the term "surrogate key," you know the hornet's nest of opinions that swirls around the topic. For those who haven't heard the term, here is my attempt at a quick summary: A surrogate key is a generated unique value that is used as the primary key of a database table; database designers tend to consider surrogate keys when the natural key consists of many columns, is very long, or may need to change.
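To make that definition concrete, here is a minimal sketch using Python's built-in sqlite3 module; the customer table and its columns are hypothetical illustrations, not taken from the column itself.

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # customer_id is the surrogate key: a generated unique value with no
    # business meaning. The natural key (the four address-style columns)
    # is long, multi-column, and may need to change, so it is kept as a
    # UNIQUE constraint rather than serving as the primary key.
    conn.execute("""
        CREATE TABLE customer (
            customer_id INTEGER PRIMARY KEY AUTOINCREMENT,
            last_name   TEXT NOT NULL,
            first_name  TEXT NOT NULL,
            postal_code TEXT NOT NULL,
            street_addr TEXT NOT NULL,
            UNIQUE (last_name, first_name, postal_code, street_addr)
        )
    """)

    conn.execute(
        "INSERT INTO customer (last_name, first_name, postal_code, street_addr)"
        " VALUES (?, ?, ?, ?)",
        ("Smith", "Pat", "27601", "1 Main St"),
    )

    # Other tables would reference this single generated value, not the
    # four natural-key columns.
    print(conn.execute("SELECT customer_id FROM customer").fetchone())  # (1,)

The practical payoff is that any table referencing customer carries one short integer instead of repeating the four-column natural key, and the reference remains stable even if the underlying attributes change.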
Columns - Quest IOUG Database & Technology Insights
If you have not heard the buzz around Oracle's Engineered Systems, then you must not have been keeping up with Oracle's marketing promotions, reading any technical journals, or even just walking through any airports lately. While the marketing claims are impressive, the use of Marvel's characters is imaginative, and Oracle's slogan "Hardware and Software Engineered to Work Together" is enticing, you may still be asking yourself, "What does all of this mean to me?" Well, let's illustrate by example.
Columns - SQL Server Drill Down
SQL Server 2012 introduces many new features which, like the columnstore indexes I discussed last month, are inspiring a lot of excitement in the user community. However, there's been a bit of confusion around the set of features commonly known as AlwaysOn.
MV Community
Entrinsik, developer of Informer reporting and business intelligence (BI) software, is expanding its training and implementation services to offer key performance indicator (KPI) consulting and development to customers using Informer Dashboards, enabling them to visualize real-time data from multiple sources on one screen and quickly identify actionable information and key trends. "Working with our customers to develop the right KPIs for their business helps them to improve decision making through the greater use of data-driven decision making," says Sharon Shelton, vice president of marketing at Entrinsik.
Australia-based Koorong Books has migrated its enterprise computer system to run on the InterSystems CACHÉ high-performance object database, enhancing its legacy system and improving performance and integration. "One of the most important characteristics of the next generation of applications is that they will utilize all of the data in the enterprise," Robert Nagle, vice president of software development at InterSystems, tells DBTA.
BlueFinity's upcoming release of mv.NET, version 4.3, will bring important additions to the company's flagship product, including support for Visual Studio 2012 and for RESTful web service development, as well as performance enhancements, the company says.
Rocket U2 has announced DataVu V2.2, which strengthens connectivity with UniData and UniVerse (U2) and adds team authoring, mobile access, and an optional text analytics capability for unstructured data. DataVu delivers real-time data access, and its interactive drill-downs provide the right information, at the right time, to desktop, web, and mobile users. It also enables fresh information to be drawn concurrently, in real time, from across the enterprise, whether from transactional data sources, data warehouses, or the content of structured files.