Data Modeling Articles
Changes and enhancements to solutions are hard, even under the best of circumstances. It is not unusual that, as operational changes roll out into production, the business intelligence area is left uninformed, suggesting that data warehouses and business intelligence deserve the lament of the old comedian Rodney Dangerfield: they both "get no respect."
Posted January 07, 2014
The newest release of Oracle SQL Developer, Oracle's integrated development environment, optimizes development and database administration for Oracle Database 12c and expands automation of third-party migrations to Oracle. Addressing the need for user-friendly tools to speed and simplify development and data management activities, Oracle is seeking to increase productivity for database development tasks so organizations can fully capitalize on the power of enterprise data.
Posted December 23, 2013
SAP is strengthening its commitment to the developer community with key open source contributions, a real-time development experience for SAP HANA, and the publication of a new unified developer license.
Posted December 18, 2013
The latest release of CA ERwin Data Modeler, a solution for collaboratively visualizing and managing business data, addresses two major objectives: the need for organizations to manage more data across more platforms, and the need to share that data easily with an expanding number of users across a range of roles and skill sets.
Posted December 17, 2013
OpenText, a provider of Enterprise Information Management (EIM) software, has announced Project Red Oxygen, which the company describes as a "harmonized" release of new EIM software advancements designed to give CIOs the ability to extract value from their enterprise information and accelerate competitive advantage.
Posted December 02, 2013
Two new approaches to application quality have emerged: "risk-based testing," pioneered in particular by Rex Black, and "exploratory testing," as evangelized by James Bach and others. Neither claims to eradicate issues of application quality, which will most likely persist as long as software coding involves human beings. However, along with automation of the more routine tests, these techniques form the basis for higher-quality application software.
Posted November 20, 2013
Changes to database structures should be performed in a coordinated fashion as the application processes that support the new functionality are rolled out into production. While the "work" involved in adding a column or a table to a relational database is actually minimal (see the sketch below), developers and DBAs often create additional columns and tables in anticipation of future needs. Sadly, this "proactive" effort results in databases littered with half-formed ideas, fits and starts, and scattered-about columns and tables that provide no meaningful content.
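As a minimal sketch of how small that "work" really is (the table and column names here are purely illustrative, not from any particular system):

```sql
CREATE TABLE customer (
    customer_id INTEGER     NOT NULL PRIMARY KEY,
    full_name   VARCHAR(80) NOT NULL
);

-- Adding a column is trivial DDL; the discipline lies in timing it to the
-- application release that actually populates and uses the new column
ALTER TABLE customer
    ADD loyalty_tier VARCHAR(10);
```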
Posted November 13, 2013
Constantly changing tax rules can make payroll deductions and tax payments a time-consuming and costly endeavor for businesses. To get this onerous job done efficiently and cost-effectively, many businesses turn to payroll software specialists that provide tools to support their in-house staff. Read how Revelation Software's OpenInsight and OpenInsight for Web are giving Ardbrook, a Dublin, Ireland-based provider of payroll software, the agility it needs.
Posted October 23, 2013
Oracle has released the latest version of Oracle VM VirtualBox, which provides a virtual multi-touch user interface, supports additional devices and platforms, and offers enhanced networking capabilities that allow developers to virtualize modern operating system features while maintaining compatibility with legacy operating systems. Designed for IT professionals, Oracle VM VirtualBox is cross-platform virtualization software that enables users to run multiple operating systems at the same time.
Posted October 16, 2013
How does one avoid the semantically wishy-washy use of NULL surrogates and instead actually design structures wherein NULLs are not necessary?
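One common tactic, sketched minimally below with invented table and column names, is to factor an optional attribute into its own table, so a row exists only when the value is known and neither a NULL nor a dummy surrogate value is ever stored:

```sql
-- Instead of a nullable termination_date on every employee row...
CREATE TABLE employee (
    employee_id INTEGER     NOT NULL PRIMARY KEY,
    full_name   VARCHAR(80) NOT NULL,
    hire_date   DATE        NOT NULL
);

-- ...the optional fact gets its own table; a row exists only when the
-- value is known, so no NULL (and no fake "9999-12-31" date) is needed
CREATE TABLE employee_termination (
    employee_id      INTEGER NOT NULL PRIMARY KEY
                     REFERENCES employee,
    termination_date DATE    NOT NULL
);
```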
Posted October 09, 2013
A successful DBA must understand application development and the issues involved in programming and design. Here are some things that every DBA must know about application development and the design projects of their organization.
Posted October 09, 2013
Database management systems support numerous unique date and time functions, but while the date-related functions are many, they do not go far enough. One date-driven circumstance often encountered involves objects that need a date range associated with them. With some exceptions, this need generally ends up implemented via two distinct date columns, one signaling the "start" and the other designating the "end." Perhaps, should the creative juices of DBMS builders flow, numeric-range data types could be created in addition to a date-range data type. Who knows where things could end up?
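For what it is worth, at least one DBMS has moved in this direction: PostgreSQL offers built-in range types, including daterange and numeric ranges. A rough sketch contrasting the two approaches, with illustrative table and column names:

```sql
-- The common workaround: two distinct date columns bracket the range
CREATE TABLE product_price (
    product_code VARCHAR(20)  NOT NULL,
    list_price   NUMERIC(9,2) NOT NULL,
    start_date   DATE         NOT NULL,  -- the "start"
    end_date     DATE         NOT NULL,  -- the "end"
    PRIMARY KEY (product_code, start_date)
);

-- PostgreSQL's daterange expresses the range as one value, and an
-- EXCLUDE constraint can even forbid overlapping ranges per product
CREATE EXTENSION IF NOT EXISTS btree_gist;
CREATE TABLE product_price_ranged (
    product_code VARCHAR(20)  NOT NULL,
    list_price   NUMERIC(9,2) NOT NULL,
    valid_during DATERANGE    NOT NULL,
    EXCLUDE USING gist (product_code WITH =, valid_during WITH &&)
);
```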
Posted September 11, 2013
Data models attempt to express the business rules of an organization. A good data model reflects the semantics used within an organization to such an extent that business people can relate to and easily agree with what is being expressed. In this regard, the data modeler's goal is to mirror the organization's concepts back to the people within the organization. The goal is not to force an organization into a "standard" data model, nor is it to abstract everything into a master model that would never need to change even if the business rules shifted drastically.
Posted September 03, 2013
One of the principles within relational theory is that each entity's row, or tuple, be uniquely identifiable. This means the defined structure includes some combination of attributes whose populated values identify an individual row within the table/relation. This attribute, or combination of attributes, is a candidate key for the structure. If a structure has a single candidate key, it serves as the primary key; if there are multiple candidate keys, one of them is designated as the primary key. When building up a logical design, primary keys should be identified by the actual data points in play.
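A minimal illustration of a candidate key drawn from actual data points and designated as the primary key (all names here are hypothetical):

```sql
-- The combination of real data points that uniquely identifies a row
-- forms the candidate key, here designated as the primary key
CREATE TABLE course_offering (
    course_code CHAR(8)     NOT NULL,
    semester    CHAR(6)     NOT NULL,
    section_no  SMALLINT    NOT NULL,
    room        VARCHAR(20),
    PRIMARY KEY (course_code, semester, section_no)
);
```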
Posted August 06, 2013
Dell has released Toad for Oracle 12.0, which provides developers and DBAs with a key new capability: a seamless connection to the Toad World user community, so they no longer have to exit the tool and open a browser to gain access to the community. "The actual strength of the product has always been the input of users," John Whittaker, senior director of marketing for the Information Management Group at Dell Software, tells 5 Minute Briefing. The new ability to access the Toad World community from within Toad enables database professionals to browse, search, ask questions, and start discussions directly in the Toad forums, all while using Toad.
Posted June 19, 2013
The grain of a fact table is derived from the dimensions with which the fact is associated. For example, should a fact have associations with a Day dimension, a Location dimension, a Customer dimension, and a Product dimension, the usual assumption is that the fact's metrics are at a "by Day," "by Location," "by Customer," "by Product" level. Evidence of this specific level of granularity is the fact table's primary key being the composite of the Day, Location, Customer, and Product dimension keys. However, this granularity and these relationships are easily disrupted.
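A sketch of such a fact table follows, with stub dimension tables included only so the example stands alone (all names are illustrative):

```sql
CREATE TABLE day_dim      (day_key      INTEGER PRIMARY KEY);
CREATE TABLE location_dim (location_key INTEGER PRIMARY KEY);
CREATE TABLE customer_dim (customer_key INTEGER PRIMARY KEY);
CREATE TABLE product_dim  (product_key  INTEGER PRIMARY KEY);

-- The grain is "by Day, by Location, by Customer, by Product":
-- the primary key is the composite of the four dimension keys
CREATE TABLE sales_fact (
    day_key      INTEGER NOT NULL REFERENCES day_dim,
    location_key INTEGER NOT NULL REFERENCES location_dim,
    customer_key INTEGER NOT NULL REFERENCES customer_dim,
    product_key  INTEGER NOT NULL REFERENCES product_dim,
    sales_amount NUMERIC(12,2) NOT NULL,
    units_sold   INTEGER       NOT NULL,
    PRIMARY KEY (day_key, location_key, customer_key, product_key)
);
```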
Posted June 13, 2013
There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.
Posted June 13, 2013
It seems that juggling is the most useful of all skills when embarking on a data warehousing project. During the discovery and analysis phase, the workload grows insanely large, like some mutant science fiction monster. Pressures to deliver can encourage rampant corner-cutting to move quickly, while the need to provide value urges caution in order not to throw out the proverbial baby with the bath water as the project speeds along. Change data capture is one area that is a glaring example of the necessary juggling and balancing.
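To make the change data capture example concrete: one of the simplest CDC tactics is a timestamp-based delta extract. The sketch below assumes a last_update_ts audit column and host variables for the extract window; none of these names come from the article:

```sql
-- Rows changed since the last load are identified by comparing an audit
-- timestamp against the prior extract's high-water mark
SELECT c.*
FROM   customer c
WHERE  c.last_update_ts >  :previous_extract_ts
  AND  c.last_update_ts <= :current_extract_ts;
```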
Posted May 22, 2013
Datawatch Corporation, provider of information optimization solutions, has announced a strategic partnership with Lavastorm Analytics, an analytics software vendor, to provide customers the ability to expand their use of unstructured and semi-structured data sources when developing analytic applications.
Posted May 07, 2013
Dimensions are the workhorses of a multidimensional design. They manage the numeric content being analyzed; it is through dimensions that metrics can be sliced, diced, drilled down, filtered, and sorted. Many people relate to dimensions by thinking of them as reference tables, but that is not exactly accurate. A dimension groups together the textual/descriptor columns within a rationalized business category. Much of the content coming from relational tables may indeed be sourced from reference tables, but the relationship between each source reference table and the targeted dimension is unlikely to be one-to-one. These grouped-format dimensions often contain one or more hierarchies of related data items used within the OLAP queries the structures support.
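As a rough sketch of such a grouped-format dimension (names invented for illustration), note how descriptors that might live in separate category, subcategory, and brand reference tables are flattened into one dimension carrying a natural drill-down hierarchy:

```sql
-- One dimension, many source reference tables: category > subcategory >
-- product forms a hierarchy for OLAP drill-down
CREATE TABLE product_dim (
    product_key      INTEGER     NOT NULL PRIMARY KEY,  -- surrogate key
    product_code     VARCHAR(20) NOT NULL,              -- natural key
    product_name     VARCHAR(80) NOT NULL,
    subcategory_name VARCHAR(40) NOT NULL,
    category_name    VARCHAR(40) NOT NULL,
    brand_name       VARCHAR(40) NOT NULL
);
```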
Posted April 10, 2013
Do not allow well-meaning but confused proponents to obscure concepts related to normalization and dimensional design. Under a normalized approach, one usually would not expect numeric data items and textual data items describing the same entity to fall into different logical relations; within a multidimensional approach, that is exactly what happens. Multidimensional design and normalized design are not the same, and one cannot claim that both approaches were used and that they resulted in the same data model.
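A hedged sketch of the divergence, with all names invented for illustration: the same order-line entity normalizes with text and numbers together, but dimensionalizes with the text pushed into a dimension and the numbers kept in the fact:

```sql
-- Normalized: an order line's descriptors and measures live together
CREATE TABLE order_line (
    order_no    INTEGER       NOT NULL,
    line_no     SMALLINT      NOT NULL,
    ship_mode   VARCHAR(20)   NOT NULL,  -- textual data item
    quantity    INTEGER       NOT NULL,  -- numeric data item
    line_amount NUMERIC(12,2) NOT NULL,  -- numeric data item
    PRIMARY KEY (order_no, line_no)
);

-- Dimensional: the textual descriptor migrates into a dimension,
-- while the numeric measures land in the fact
CREATE TABLE ship_mode_dim (
    ship_mode_key INTEGER     NOT NULL PRIMARY KEY,
    ship_mode     VARCHAR(20) NOT NULL
);
CREATE TABLE order_line_fact (
    ship_mode_key INTEGER       NOT NULL REFERENCES ship_mode_dim,
    order_no      INTEGER       NOT NULL,
    line_no       SMALLINT      NOT NULL,
    quantity      INTEGER       NOT NULL,
    line_amount   NUMERIC(12,2) NOT NULL,
    PRIMARY KEY (order_no, line_no)
);
```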
Posted March 14, 2013
Establishing a data warehousing or business intelligence environment initiates a process that works its way through the operational applications and data sources across an enterprise. This process focuses not only on identifying the important data elements the business lives and breathes, but also on explaining those elements rationally to business intelligence users.
Posted February 27, 2013
Sonata Software, an IT consulting and software services provider headquartered in Bangalore, India, has announced its center of excellence (CoE) for Exalytics, Oracle's engineered system designed for high performance data analysis, modeling and planning.
Posted February 20, 2013
Multidimensional design involves dividing the world into dimensions and facts. However, like many aspects of language, the term "fact" is used in multiple ways. Initially, it referred to the table structure housing the numeric values for the metrics to be analyzed. But "fact" is also used to refer to the metric values themselves. Therefore, in the unusual circumstance where a fact table is defined that contains no specific numeric measures, such a structure is referred to by the superficially oxymoronic characterization of a "factless fact."
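A minimal sketch of the idea, using the classic student-attendance example (all table and column names are illustrative, with stub dimensions so the example stands alone):

```sql
CREATE TABLE day_dim     (day_key     INTEGER PRIMARY KEY);
CREATE TABLE student_dim (student_key INTEGER PRIMARY KEY);
CREATE TABLE class_dim   (class_key   INTEGER PRIMARY KEY);

-- The row itself records the event (a student attended a class on a day);
-- there is no numeric measure column at all
CREATE TABLE attendance_fact (
    day_key     INTEGER NOT NULL REFERENCES day_dim,
    student_key INTEGER NOT NULL REFERENCES student_dim,
    class_key   INTEGER NOT NULL REFERENCES class_dim,
    PRIMARY KEY (day_key, student_key, class_key)
);

-- Analysis proceeds by counting rows rather than summing measures
SELECT class_key, COUNT(*) AS days_attended
FROM   attendance_fact
GROUP  BY class_key;
```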
Posted February 12, 2013
Within the information technology sector, the term "architect" gets thrown around quite a lot. There are software architects, infrastructure architects, application architects, business intelligence architects, data architects, information architects, and more. It seems as if any area may include someone with "architect" status. Certainly, when laying out plans for a physical building, an architect has a specific meaning and role. But within IT, "architect" is used in a much fuzzier manner.
Posted December 11, 2012
Micro Focus, a provider of enterprise application modernization solutions, announced it is shipping a new release of its COBOL application migration toolset, with support for Microsoft's latest operating system and integrated development environment. Visual COBOL 2.1 also includes many enhancements designed to facilitate an improved application developer experience, as well as deliver an upgrade path for core business applications. "With support for the Windows 8 platform, Visual COBOL provides even greater developer productivity, collaboration, and quality improvements while reducing their costs and time to market," Ed Airey, Micro Focus product marketing director for COBOL Products, tells 5 Minute Briefing.
Posted December 10, 2012
A new educational webcast examines the results of the 2012 IOUG Test, Development & QA Survey, and covers the best practices and issues it highlights. Mining the data assets gathered from all corners of the enterprise, including transactions, customer data, employee input, and information about market conditions, has been essential to companies in uncovering new opportunities. But in the rush to deliver results, many IT and development departments take shortcuts in the testing process, taking live data straight out of production environments to run through testing, development, and quality assurance.
Posted November 21, 2012
In writing a definition for an entity, an attribute, or any other element within a database design, the desired end is descriptive text that is clear, factual, and concise. Semantics are an ambiguous and often painful tool to employ, and balancing the need for clarity against the desire to avoid redundancy can be a hard juggling act. One may not easily recognize what is complete, what is lacking, and what has gone too far. Even so, if within a definition one finds oneself listing valid values and decoding each value's meaning, one has likely moved beyond what is "concise." Lists of values add bulk and verbiage to a definition, yet they do not usually increase its quality.
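One way to keep a definition concise is to let a reference table, rather than the definition's prose, carry the valid values and their decodes. A minimal sketch, with invented names and values:

```sql
-- The definition of order_status_code can stay short ("the code that
-- identifies an order's processing state") while the values themselves
-- live as maintainable data in a reference table
CREATE TABLE order_status_code (
    status_code CHAR(2)      NOT NULL PRIMARY KEY,
    description VARCHAR(100) NOT NULL
);

INSERT INTO order_status_code (status_code, description) VALUES
    ('OP', 'Order received and open'),
    ('SH', 'Order shipped'),
    ('CL', 'Order closed and complete');
```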
Posted November 13, 2012
The beauty of a truly wonderful database design is its ability to serve many masters. And good database designers are able to empathize with those who will use their designs. In business intelligence settings, three perspectives deserve consideration when composing designs.
Posted November 06, 2012
Software operates the products and services that we use and rely on in our daily lives, and it is often the competitive differentiation for the business. As software increases in size, complexity, and importance to the business, so do the business demands on development teams. Developers are increasingly accountable for delivering more innovation, under shorter development cycles, without negatively impacting quality. Compounding this complexity is today's norm of geographically distributed teams and code coming in from third-party teams. With so many moving parts, it is difficult for management to get visibility across the internal and external supply chain. Yet without early warning of potential quality risks that could impact release schedules or create long-term technical debt, there may be little time to actually do something about them before the business or customers are impacted.
Posted October 24, 2012
CA Technologies has announced a major new release of the ERwin data modeling solution. This new release, the second in less than a year, provides a collaborative data modeling environment to manage enterprise data using an intuitive, graphical interface. It helps improve data reuse, optimize system quality, accelerate time-to-benefit, and enable appropriate information governance, all key objectives for IT organizations serving companies in today's highly competitive and closely regulated markets.
Posted October 16, 2012
It is an understatement to say we're witnessing an example of Moore's Law — which states the number of transistors on a chip will double approximately every two years — as we seek to manage the explosion of big data. Given the impact this new wealth of information has on hundreds of millions of business transactions, there's an urgent need to look beyond traditional insight-generation tools and techniques. It's critical we develop new tools and skills to extract the insights that organizations seek through predictive analytics.
Posted October 16, 2012
An educational and interactive webcast will review the findings of the 2012 IOUG Test, Development and QA Survey and discuss the best practices and issues it highlights. The IOUG study was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by IBM. Presented by Kimberly Madia, WW product marketing manager at IBM, and Thomas Wilson, president and CEO, Unisphere Research, the webcast will be held Thursday, September 27, from 12 to 1 pm CDT. Webcast attendees will receive a copy of the study report.
Posted September 26, 2012