Database Management Articles
Oracle CEO Larry Ellison made three key announcements in his opening keynote at Oracle OpenWorld, the company's annual conference for customers and partners in San Francisco. Ellison unveiled the Oracle Database In-Memory Option for Oracle Database 12c, which he said speeds up query processing by "orders of magnitude"; the M6 Big Memory Machine; and the new Oracle Database Backup Logging Recovery Appliance. Explaining Oracle's goals with the new in-memory option, Ellison noted that in the past there have been row-format databases and column-format databases intended to speed up query processing. "We had a better idea. What if we store data in both formats simultaneously?"
Posted September 24, 2013
Featuring a multi-tenant architecture that streamlines the process of consolidating databases onto the cloud and enables organizations to manage many databases as one, Oracle Database 12c is a next-generation database. To avoid confusion, however, here are 10 things that Oracle Database 12c is not.
Posted September 24, 2013
Oracle holds an enviable position in the IT marketplace with a wide array of database systems, development tools, languages, platforms, enterprise applications, and servers. Riding the coattails of this industry giant is a healthy and far-flung ecosystem of software developers, integrators, consultants, and OEMs. These are the partners that will help make or break Oracle's struggle with new forces disrupting the very foundations of IT. And lately, Oracle—long known for its own brand of xenophobia and disdain for direct competitors—has been making a lot of waves by forging new alliances with old foes. This is opening up potentially lucrative new frontiers for business partners at all levels.
Posted September 16, 2013
Database management systems support numerous unique date and time functions, and while the date-related functions are many, they do not go far enough. One date-driven circumstance often encountered involves objects that need a date range associated with them. With some exceptions, this need generally ends up implemented via two distinct date columns, one signaling the "start" and the other designating the "end." Perhaps, should the creative juices of DBMS builders flow, such things as numeric-range data types could be created in addition to a date-range data type. Who knows where things could end up?
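To make the point concrete, here is a minimal sketch of the familiar two-column approach alongside a native range column; the table and column names are hypothetical, the first statements are generic SQL, and the range-type portion assumes PostgreSQL's daterange type and containment operator.

```sql
-- Conventional approach: the range is split across two date columns,
-- with NULL in the "end" column standing in for "still in effect."
CREATE TABLE product_price (
    product_id      INTEGER       NOT NULL,
    list_price      DECIMAL(9,2)  NOT NULL,
    effective_start DATE          NOT NULL,
    effective_end   DATE,                      -- NULL means no end date yet
    PRIMARY KEY (product_id, effective_start)
);

-- Finding the price in effect on a given day means testing both columns.
SELECT list_price
  FROM product_price
 WHERE product_id = 123
   AND effective_start <= DATE '2013-09-01'
   AND (effective_end IS NULL OR effective_end >= DATE '2013-09-01');

-- With a built-in range type (PostgreSQL syntax), the same idea collapses
-- into a single column and a single containment test.
CREATE TABLE product_price_rng (
    product_id  INTEGER       NOT NULL,
    list_price  DECIMAL(9,2)  NOT NULL,
    effective   DATERANGE     NOT NULL
);

SELECT list_price
  FROM product_price_rng
 WHERE product_id = 123
   AND effective @> DATE '2013-09-01';   -- "@>" tests range containment
```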
Posted September 11, 2013
Many of the new features coming in SQL Server 2014, now available in Community Technology Preview, are encapsulated within broader and rather intuitive categories. The major categories for new features in SQL Server 2014 are Mission-Critical Performance Enhancements, Business Intelligence Insights, and Hybrid Cloud Enhancements. In addition, one of the interesting knock-on effects of retooling SQL Server to run in the cloud is that the code has tightened up a lot.
Posted September 11, 2013
DBAs need to make many different types of changes to a database over its lifetime. Some will be simple and easy to implement, others much more difficult and complex. It is the DBA's job to understand the best way to implement any type of database change, but often, simple changes become more difficult in the real world. Database change management tools help to make this job easier, and they are among the first tools acquired by many organizations when they implement a database of any size.
Posted September 11, 2013
Oracle has introduced its latest ZFS Storage Appliances, the ZS3 Series, aimed at enabling customers to improve operational efficiencies, reduce data center costs, and increase business application performance.
Posted September 11, 2013
EnterpriseDB, provider of enterprise-class PostgreSQL products and Oracle database compatibility solutions, has formed a partnership with Zmanda, a provider of open source and cloud backup solutions. Through the partnership, Zmanda has created and certified a NetBackup agent for EnterpriseDB's flagship Postgres Plus Advanced Server.
Posted September 10, 2013
The PostgreSQL Global Development Group has announced the release of the open source relational database system PostgreSQL 9.3, which expands PostgreSQL's reliability and availability, as well as its ability to integrate with other databases.
Posted September 10, 2013
Confio Software, which offers the Ignite line of database performance monitoring tools, has secured the Certificate of Networthiness (CoN) from the United States Department of Defense. Confio's CoN applies to all Ignite solutions for Oracle, Microsoft SQL Server, VMware, DB2 and Sybase.
Posted September 10, 2013
Raima, a provider of database management system technology for both in-memory database usage and persistent storage devices, has announced the availability of Raima Database Manager (RDM) version 12.0, optimized for embedded, real-time, in-memory and mobile applications.
Posted September 03, 2013
SAP AG introduced new high availability and disaster recovery functionality with SAP Sybase Replication Server for SAP Business Suite software running on SAP Sybase Adaptive Server Enterprise (SAP Sybase ASE). "After only a year and a quarter supporting the Business Suite, ASE has already garnered about 2,000 customer installations. This easily provides that near zero-downtime for HA/DR that is non-intrusive to the system using Replication Server as the key enabling technology," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview.
Posted August 31, 2013
SAP has launched Sybase ASE (Adaptive Server Enterprise) 15.7 service pack 100 (SP100) to provide higher performance and scalability as well as improved monitoring and diagnostic capabilities for very large database environments. "The new release adds features in three areas to drive transactional environments to even more extreme levels. We really see ASE moving increasingly into extreme transactions and to do that we have organized the feature set around the three areas," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview with 5 Minute Briefing.
Posted August 31, 2013
To help developers and administrators better manage their dynamic data environments, Oracle has released MySQL Workbench 6.0. MySQL Workbench is a unified visual tool that provides data modeling, SQL development, and comprehensive administration for server configuration, user administration, and migration. The new release is a major update that addresses over 200 inputs and requests from the community and is intended to make it easier for administrators and developers to design, develop, and manage their MySQL databases.
Posted August 21, 2013
GenieDB has announced the launch of the GenieDB Globally Distributed MySQL-as-a-Service. The DBaaS offering allows organizations to take advantage of GenieDB's automated platform to build web-scale applications that gain the benefits of geographical database distribution, continuous availability during regional outages, and better application response time for globally distributed users.
Posted August 20, 2013
NuoDB has announced the last release of its current product version and a technology preview of some upcoming second-generation features available later in 2013. The preview is contained in the free download of the new NuoDB Starlings Release 1.2. The NewSQL approach is gaining greater acceptance, said Barry Morris, founder and CEO of NuoDB, in an interview. "What people are saying back to us is that they are getting all of the features of NoSQL without throwing SQL or transactions away. And that concept is becoming the popular notion of what NewSQL is."
Posted August 13, 2013
Just about every company with a DBMS has that binder full of corporate and/or IT standards. That one over there in the corner with the cobwebs on it — the one that you only use when you need an excuse to avoid work. Okay, well, maybe it's not quite that bad. Your standards documents could be on the company intranet or some other online mechanism (but chances are there will be virtual cobwebs on your online standards manuals, too).
Posted August 07, 2013
I was recently chatting with a good friend of mine who's very highly placed in the Microsoft SQL Server team. Our conversation was wide-ranging and covered a lot of topics, such as internal features and upcoming announcements. (I'm under at least three different NDAs, so don't expect me to give up anything too juicy or gossipy.) For example, we spent quite a while discussing the ton of great new features and improvements just over the horizon with the recent release of SQL Server 2014 CTP1.
Posted August 07, 2013
A former colleague is looking for a database server to embed into an important new factory automation application his company is building. The application will manage data from a large number of sensor readings emanating from each new piece of industrial equipment his company manufactures. These values, such as operating temperature, material thickness, cutting depth, etc., fit into the data category commonly called "SCADA" - supervisory control and data acquisition. Storing, managing and analyzing this SCADA data is a critical enhancement to this colleague's new application. His large customers may have multiple locations worldwide and must be able to view and analyze the readings, both current and historical, from each piece of machinery across their enterprise.
Posted August 07, 2013
One of the principles within relational theory is that each entity's row, or tuple, be uniquely identifiable. This means the defined structure includes some combination of attributes whose populated values serve to identify an individual row within the table/relation. This attribute, or combination of attributes, is a candidate key for the structure. If a structure has a single candidate key, that key serves as the primary key; if it has multiple candidate keys, one of them is designated as the primary key. When building up a logical design, primary keys should be identified by the actual data points in play.
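As a small hypothetical illustration in generic SQL, the table below has two candidate keys (an employee number and a badge number), each of which uniquely identifies a row; one is designated the primary key and the other is enforced with a unique constraint.

```sql
-- Hypothetical table: two attribute combinations each uniquely identify a row,
-- so both are candidate keys.
CREATE TABLE employee (
    employee_number  INTEGER      NOT NULL,   -- candidate key 1, designated primary key
    badge_number     CHAR(10)     NOT NULL,   -- candidate key 2
    last_name        VARCHAR(60)  NOT NULL,
    hire_date        DATE         NOT NULL,
    PRIMARY KEY (employee_number),
    UNIQUE (badge_number)
);
```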
Posted August 06, 2013
NuoDB, provider of a NewSQL database, has announced the beta program for a new tool that facilitates migration from MySQL, Microsoft SQL Server, IBM DB2, PostgreSQL and Oracle RDBMSs. The migration tool is open source and available for free download on the NuoDB DevCenter or on GitHub.
Posted July 30, 2013
IBM says it is accelerating its Linux on Power initiative with the new PowerLinux 7R4 server as well as new software and middleware applications geared for big data, analytics and next-generation Java applications in an open cloud environment. According to IBM, the new PowerLinux 7R4 server, built on the same Power Systems platform running IBM's Watson cognitive computing solution, can provide clients the performance required for the new business-critical and data-intensive workloads increasingly being deployed in Linux environments. IBM is also expanding the portfolio of software for Power Systems with the availability of IBM Cognos Business Intelligence and EnterpriseDB database software, each optimized for Linux on Power.
Posted July 30, 2013
After four years of operating BigCouch in production, Cloudant has merged the BigCouch code back into the open source Apache CouchDB project. Cloudant provides a database-as-a-service and CouchDB serves as the foundation of Cloudant's technology. The company developed BigCouch, an open source variant of CouchDB, to support large-scale, globally distributed applications. There are three main reasons Cloudant is doing this, Adam Kocoloski, co-founder and CTO at Cloudant, told 5 Minute Briefing in an interview.
Posted July 30, 2013
Oracle has announced the latest 12c releases of its Cloud Application Foundation, which integrates application server and in-memory data grid capabilities into a foundation for cloud computing, representing "a major release of our middleware infrastructure," Mike Lehmann, Oracle vice president of product management, tells 5 Minute Briefing. The focus for the products is to provide mission-critical cloud infrastructure and a lot of work has been done around native cloud capabilities, says Lehmann.
Posted July 17, 2013
The Oracle database provides intriguing possibilities for the storing, manipulating and streaming of multimedia data in enterprise class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.
Posted July 17, 2013
MemSQL, a provider of real-time analytics, announced the availability of MemSQL 2.1, which includes new features and enhancements to enable customers to access, explore and increase the value of data, regardless of size or file format. To meet the demands posed by increasing amounts of data and data types, MemSQL has updated its analytics platform to enable customers to receive real-time results on analytical queries across both real-time and historical datasets.
Posted July 16, 2013
Oracle Database 12c is available for download from Oracle Technology Network (OTN). First announced by Oracle CEO Larry Ellison during his keynote at Oracle OpenWorld 2012, Oracle Database 12c introduces a new multi-tenant architecture that simplifies the process of consolidating databases onto the cloud, enabling customers to manage many databases as one - without changing their applications. During the OpenWorld keynote, Ellison described Oracle Database 12c as "the first multi-tenant database in the world" and said it provides "a fundamentally new architecture" to "introduce the notion of a container database" with the ability to plug multiple separate, private databases into that single container.
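For a sense of how the container/pluggable model surfaces in practice, here is a minimal sketch using Oracle Database 12c SQL; the PDB name, admin user, password and file paths are illustrative only, and the exact options will vary by installation.

```sql
-- Connected to the container database (CDB) with CREATE PLUGGABLE DATABASE
-- privilege, a new pluggable database is created from the seed PDB.
CREATE PLUGGABLE DATABASE sales_pdb
  ADMIN USER sales_admin IDENTIFIED BY sales_admin_pwd
  FILE_NAME_CONVERT = ('/u01/app/oracle/oradata/cdb1/pdbseed/',
                       '/u01/app/oracle/oradata/cdb1/sales_pdb/');

-- The new PDB starts in MOUNTED state and must be opened before use.
ALTER PLUGGABLE DATABASE sales_pdb OPEN;
```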
Posted July 09, 2013
Dell Software has introduced the latest version of the Dell KACE K1000 Management Appliance, which now includes integrated software asset management to boost software license compliance, while helping lower IT costs. The K1000 adds automated software asset identification, tracking and optimization to its capabilities for managing the deployment, operation and retirement of software assets. The need for the appliance is being fueled by a range of factors, including the influx of new technologies such as cloud computing, virtualization, and BYOD, which are adding complexity in terms of systems management, Lisa Richardson, senior product marketing manager for Endpoint Systems Management, Dell Software, tells 5 Minute Briefing.
Posted July 08, 2013
SAP AG announced this week that version 16 of the company's Sybase software has achieved a Guinness World Record for loading and indexing big data. In cooperation with BMMsoft, HP and Red Hat, SAP Sybase IQ 16 achieved an audited result of 34.3 terabytes per hour, surpassing the previous record of 14 terabytes per hour achieved by the same team using an earlier version of SAP Sybase IQ. The latest version of the real-time analytics server and enterprise data warehouse (EDW) provides a new, fully parallel data loading capability and a next-generation column store, enabling the jump in big data performance.
Posted June 27, 2013
RainStor, a provider of an enterprise database for managing and analyzing historical data, says it has combined the latest data security technologies in a comprehensive product update that has the potential to rapidly increase adoption of Apache Hadoop for banks, communications providers and government agencies.
Posted June 27, 2013
The amount of data being generated, captured and analyzed worldwide is increasing at a rate that was inconceivable a few years ago. Exciting new technologies and methodologies are evolving to address this phenomenon of science and culture, creating huge new opportunities. These new technologies are also fundamentally changing the way we look at and use data. The rush to monetize "big data" makes the appeal of various "solutions" undeniable.
Posted June 27, 2013
Oracle announced the general availability of MySQL Cluster 7.3, which adds foreign key support, a new NoSQL JavaScript Connector for node.js, and an auto-installer to make setting up clusters easier. MySQL Cluster is an open source, auto-sharded, real-time, ACID-compliant transactional database with no single point of failure, designed for advanced web, cloud, social and mobile applications. "Foreign key support has been a longstanding feature request from day-one," Tomas Ulin, vice president of MySQL Engineering at Oracle, tells 5 Minute Briefing.
Posted June 19, 2013
Dell has released Toad for Oracle 12.0 which provides developers and DBAs with a key new capability - a seamless connection to the Toad World user community so they will no longer have to exit the tool and open a browser to gain access to the community. "The actual strength of the product has always been the input of users," John Whittaker, senior director of marketing for the Information Management Group at Dell Software, tells 5 Minute Briefing. The new ability to access the Toad World community from within Toad enables database professionals to browse, search, ask questions and start discussions directly in the Toad forums, all while using Toad.
Posted June 19, 2013
These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.
Posted June 19, 2013
The grain of a fact table is derived from the dimensions with which the fact is associated. For example, should a fact have associations with a Day dimension, a Location dimension, a Customer dimension, and a Product dimension, then the usual assumption would be that the fact is described at a "by Day," "by Location," "by Customer," "by Product" metrics level. Evidence of this specific level of granularity is seen in the fact table's primary key being the composite of the Day dimension key, Location dimension key, Customer dimension key, and Product dimension key. However, this granularity and these relationships are easily disrupted.
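As a hypothetical illustration of that grain, a sales fact table whose primary key is the composite of the four dimension keys might be declared as follows (generic SQL; the table, column and dimension names are made up for the example).

```sql
CREATE TABLE sales_fact (
    day_key       INTEGER       NOT NULL REFERENCES day_dim (day_key),
    location_key  INTEGER       NOT NULL REFERENCES location_dim (location_key),
    customer_key  INTEGER       NOT NULL REFERENCES customer_dim (customer_key),
    product_key   INTEGER       NOT NULL REFERENCES product_dim (product_key),
    sales_amount  DECIMAL(12,2) NOT NULL,
    units_sold    INTEGER       NOT NULL,
    -- The composite primary key documents the grain:
    -- one row per day, per location, per customer, per product.
    PRIMARY KEY (day_key, location_key, customer_key, product_key)
);
```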
Posted June 13, 2013