DBTA E-EDITION
June 2012
Trends and Applications
Companies are scrambling to learn all the various ways they can slice, dice, and mine big data coming in from across the enterprise and across the web. But with the rise of big data — hundreds of terabytes or petabytes of data — comes the challenge of where and how all of this information will be stored. For many organizations, current storage systems — disks, tapes, virtual tapes, clouds, in-memory systems — are not ready for the onslaught, industry experts say. There are new methodologies and technologies coming on the scene that may help address this challenge. But one thing is certain: Whether organizations manage their data in their internal data centers, or in the cloud, a lot more storage is going to be needed. As Jared Rosoff, director of customer engagement with 10gen, puts it: "Big data means we need 'big storage.'"
As data continues to grow unabated, organizations are struggling to manage it more efficiently. By better leveraging their expanding data stores and making the information available more widely, organizations hope to put big data to work — helping them to achieve greater productivity and more informed decision making, as well as to compete more effectively as a result of insights uncovered by analytics on their treasure troves of information. Improving the management of big data is not something to consider addressing at some point in the hazy future — the big data challenge is already here, according to a new survey of 264 data managers and professionals who are subscribers to Database Trends and Applications.
Social media network-based business intelligence represents the next great frontier of data management, promising decision makers vast vistas of new knowledge gleaned from exabytes of data generated by customers, employees, and business partners. Mining data from Facebook, Twitter, blogs, wikis, and internal corporate networks may surface new insights into impending market shifts, patterns in customer sentiment, and competitive intelligence. It's a rich opportunity not lost on today's organizations, a new survey of 711 business and IT managers from across the globe reveals. A majority of respondents are either planning to collect and analyze data from both proprietary and public social media networks, or are doing so already.
Columns - Notes on NoSQL
One of the earliest of the new generation of non-relational databases was CouchDB. CouchDB was born in 2005 when former Lotus Notes developer Damien Katz foresaw the non-relational wave that only fully arrived in 2009. Katz imagined a database that was fully compatible with web architectures — and more than a little influenced by Lotus Notes document database concepts.
Columns - Database Elaborations
In the dim, dark past of data warehousing, there was a time when the argument was put forward that "history does not change." It was posited that once a piece of data was received by the data warehouse, it was sacrosanct and nonvolatile. A fact record, once processed, was to remain unchanged forever. Dimensions, due to their descriptive nature, could be changed following the prescribed Type 1, 2, or 3 update strategies, but that was all. The expectation was that, by their very nature, fact tables would become huge, and in being huge would exhibit update performance so poor that updates would be virtually impossible to enact.
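As a quick refresher on the dimension strategies named above, here is a minimal sketch (Python with SQLite; the dim_customer table and its columns are hypothetical examples, not taken from the column) contrasting a Type 1 overwrite, which keeps no history, with a Type 2 change, which closes out the current row and inserts a new one. A Type 3 change, which carries the prior value in a separate "previous value" column, is omitted for brevity.

    import sqlite3
    from datetime import date

    # Hypothetical customer dimension used only for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE dim_customer ("
        " customer_key INTEGER PRIMARY KEY AUTOINCREMENT,"
        " customer_id TEXT, city TEXT, valid_from TEXT, valid_to TEXT)"
    )
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to) "
        "VALUES ('C001', 'Denver', '2010-01-01', '9999-12-31')"
    )

    # The two blocks below show alternative ways of handling the same change
    # (the customer's city moves from Denver to Boulder).

    # Type 1: overwrite the attribute in place; the old value is lost.
    conn.execute(
        "UPDATE dim_customer SET city = 'Boulder' WHERE customer_id = 'C001'"
    )

    # Type 2: expire the current row and insert a new row, preserving history.
    change_date = date(2012, 6, 1).isoformat()
    conn.execute(
        "UPDATE dim_customer SET valid_to = ? "
        "WHERE customer_id = 'C001' AND valid_to = '9999-12-31'",
        (change_date,),
    )
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to) "
        "VALUES ('C001', 'Boulder', ?, '9999-12-31')",
        (change_date,),
    )
    conn.commit()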
Columns - DBA Corner
Unless you plan for and issue regular COMMITs in your database programs, you will be causing locking problems. It is important for every programmer to issue COMMIT statements in all application programs where data is modified (INSERT, UPDATE, and DELETE). A COMMIT externalizes the modifications that occurred in the program since the beginning of the program or the last COMMIT. A COMMIT ensures that all modifications have been physically applied to the database, thereby ensuring data integrity and recoverability. Failing to code COMMITs in a data modification program is what I like to call "Bachelor Programming Syndrome" — in other words, fear of committing.
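To make the point concrete, here is a minimal sketch (Python with SQLite and the DB-API; the products table, batch size, and update logic are hypothetical) of a batch update program that commits at regular intervals instead of holding every lock, and every uncommitted modification, until the program ends.

    import sqlite3

    COMMIT_INTERVAL = 1000  # hypothetical batch size; tune to the workload

    def apply_price_updates(conn, updates):
        """Apply (new_price, product_id) pairs, committing periodically."""
        cursor = conn.cursor()
        modified = 0
        for new_price, product_id in updates:
            cursor.execute(
                "UPDATE products SET price = ? WHERE id = ?",
                (new_price, product_id),
            )
            modified += 1
            # Commit at regular intervals so locks are released and the
            # modifications made so far are externalized to the database.
            if modified % COMMIT_INTERVAL == 0:
                conn.commit()
        conn.commit()  # commit whatever remains since the last COMMIT

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, price REAL)")
        conn.executemany("INSERT INTO products (id, price) VALUES (?, ?)",
                         [(1, 10.0), (2, 12.0)])
        apply_price_updates(conn, [(9.99, 1), (14.50, 2)])
        conn.close()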
Columns - SQL Server Drill Down
By now, you've heard that Microsoft has publicly released SQL Server 2012. I have to be honest in telling you that it came sooner than I expected, despite my many inside connections at Microsoft. I was fully expecting the RTM to occur a bit before summer, just in time for a spectacular launch at Microsoft TechEd.
MV Community
Sharon Shelton, vice president of marketing at Entrinsik, showcases Hawaii Human Resources (HiHR), a company using HRPyramid and Entrinsik Informer to help provide customized HR solutions to nearly 250 clients, in a new blog post this month on the Entrinsik website. Prior to purchasing Informer, HiHR was limited to stock reports and unable to extract the specific data needed for internal and external reporting purposes, Shelton explains. But since deploying Informer, HiHR has created almost 200 reports, including basic census, 401(k) reporting, and code file listings, and the company also makes heavy use of more advanced Informer functionality, such as Live Excel.
Kore Technologies, a provider of enterprise integration, business intelligence, and e-commerce web solutions for MultiValue and Microsoft SQL Server databases, is partnering with Paciolan, a provider of ticketing, marketing, and fundraising solutions, and SSB Consulting Group to create a SQL-based data warehouse solution called Paciolan Ticketing Intelligence. The new solution will provide Paciolan's clients with real-time dashboards, cross-systems reporting, and direct data warehouse access to help them make data-driven business decisions.
BlueFinity International has announced that marine pilots at Port Castries, Saint Lucia, are now using a new electronic touch screen system, developed for them by Ideal Business Services Ltd, to track vessels at sea and to access cargo and cruise berthing logistics prior to and after vessel arrival and departure. Ideal developed the new software solution using BlueFinity's mv.NET, a suite of components designed to provide the MultiValue developer with a comprehensive solution to the challenge of creating Microsoft .NET based applications requiring access to MultiValue databases.
Earlier this month, Rocket U2 kicked off U2 University in Denver with the first-ever Rocket Day. "We had over 150 people attend the keynote session and business unit overviews before splitting into the three separate technical tracks and two hands-on lab sessions held simultaneously. We released the new U2 Toolkit for .NET (U2NETDK) the Friday prior to the event, and had two very well-attended sessions on this new offering," notes Susie Siegesmund, vice president and general manager, U2 Brand, Rocket Software.