Melissa Data Corp, a developer of data quality and address management solutions, has announced that customers can now access detailed property and mortgage data on more than 140 million U.S. properties through the company's new WebSmart Property Web Service. The comprehensive solution can source nearly any information on a given property - from parcel and owner information to square footage to zoning and more. All of the information provided by the service is publicly available data that Melissa Data compiles from various databases, Greg Brown, director of marketing for Melissa Data, tells DBTA. The service is expected to be particularly useful for property investors, mortgage and refinancing lenders, developers, real estate professionals, risk managers, insurance agencies, and companies looking to target products and services to homeowners.
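As a purely hypothetical sketch of how a client might call such a property web service - the endpoint URL, parameter names, and response fields below are illustrative assumptions, not Melissa Data's documented API:

```python
# Hypothetical property-lookup call; the URL, parameters, and response
# fields are placeholders, not Melissa Data's actual WebSmart API.
import requests

def lookup_property(license_key, fips_code, apn):
    """Fetch parcel and owner details for one property (illustrative only)."""
    resp = requests.get(
        "https://api.example.com/property/lookup",  # placeholder endpoint
        params={"id": license_key, "fips": fips_code, "apn": apn},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

record = lookup_property("DEMO_KEY", "06059", "123-456-78")
print(record.get("Parcel"), record.get("PrimaryOwner"))
```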
Posted April 27, 2011
"Big data" has emerged as an often-used catch phrase over the past year to describe exponentially growing data stores, and increasingly companies are bolstering their product lines to address the challenge. But helping companies manage and derive benefit from the onslaught of data has consistently been the focus for MarkLogic Corporation, whose flagship product, MarkLogic Server, is a purpose-built database for unstructured information. The company recently announced Ken Bado as its new chief executive officer and a member of the board of directors. In terms of new directions for the company as he takes the reins, Bado says, "First of all, you are going to see a much more aggressive message from MarkLogic with respect to unstructured and specifically ‘big data.' " In addition, there will also be changes seen in the company's go-to-market approach, he says. "Right now, our business model is a direct model through an enterprise-type selling machine, that has been quite effective in getting us to where we are, but there are three other levels that we need to address pragmatically to help us build scale and grow."
Posted April 27, 2011
Application Security, Inc. (AppSec), a provider of database security, risk and compliance (SRC) solutions for the enterprise, and Securosis, a security research and analysis firm, have partnered to provide what they are describing as the industry's first comprehensive guide to quantifying enterprise database security processes. "What we wanted to do was go to some of the experts in the industry who have not only been analysts but also lived in this environment and have them systematically go through the process and document everything from organizational considerations down to specific steps, and then provide a means to quantify the man-hours, the expenses, and the technologies associated with each step in this process," Thom VanHorn, vice president of marketing, AppSec, tells DBTA.
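To make the idea of quantification concrete, here is a minimal sketch of the kind of roll-up the guide describes - the step names, hours, and costs are invented placeholders, not figures from the AppSec/Securosis guide:

```python
# Toy cost roll-up for a database security program; every name and
# number below is an invented placeholder for illustration.
steps = [
    {"step": "discover and inventory databases", "hours": 40,  "tool_cost": 0},
    {"step": "assess vulnerabilities",           "hours": 80,  "tool_cost": 15_000},
    {"step": "monitor database activity",        "hours": 120, "tool_cost": 40_000},
]
HOURLY_RATE = 95  # assumed blended labor rate, in dollars

for s in steps:
    s["total"] = s["hours"] * HOURLY_RATE + s["tool_cost"]
    print(f"{s['step']:<35} ${s['total']:>8,}")

print(f"{'estimated program cost':<35} ${sum(s['total'] for s in steps):>8,}")
```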
Posted April 27, 2011
Database Trends and Applications (DBTA) met with Oracle Applications Users Group (OAUG) president Mark C. Clark during the recent COLLABORATE 11 conference in Orlando, Florida. Clark, who became president of the users group earlier this year, is a senior partner for O2Works, which specializes in configuring Oracle Applications to adhere to best practices and to streamline business operations. Now, more than 2 years after the financial meltdown of late 2008, it is clear that users are once again attending COLLABORATE in greater numbers. "We have gone through a period of very tight IT budgets, a 2-to-4 year phase of maintenance. Everybody I am talking to is looking at opportunities to do projects this year. And if they aren't doing it this year, they are planning for it next year," said Clark, commenting on the renewed enthusiasm for the conference.
Posted April 27, 2011
Big data is one of those terms that is quickly gaining momentum among technologists. If you watch closely, you'll notice that everyone seems to have an opinion on what "big data" means and wants to own the term. While industry experts debate what to name the problem, in 2011 companies will be tasked with bringing big data from back-office offline analytics to customer-facing 24x7 production systems. Customers are paying attention, and they need solutions that support not only massive data sets but also mixed information types, extended feature sets, and real-time processing - without requiring technical teams to hand-code these systems from the ground up. Here are five big data solution trends we see developing as our customers work hard to solve "big data" or "big information" problems.
Posted April 05, 2011
Organizations today are beginning to understand that, second to their employees, data is their most critical asset. Consequently, they need to approach data management as they approach capital management - by applying disciplined methodologies that utilize automation and actionable intelligence. Once in place, these methodologies secure and protect data in a scalable and repeatable fashion, without requiring additional intervention from IT personnel or disturbing business processes. In the age of information overload, with the explosive growth of unstructured and semi-structured data, best practices help organizations of all sizes effectively manage, control, and protect this valuable asset.
Posted April 05, 2011
On the surface, the idea of using a single source integrator to implement SAP's Enterprise Resource Planning (ERP) software seems ideal. The appeal lies in the potential to make an incredibly complex project appear simple. The single source model promises the ease of having only one vendor to pay, only one team to work with, and a single source of accountability should things go wrong. Yet what works in theory doesn't always bear out in real-world applications.
Posted April 05, 2011
A member of the Oracle Applications Users Group (OAUG) since 1992, Mark C. Clark recently took over as president of the organization. He spoke with DBTA about what's in store for members at the annual Oracle users conference COLLABORATE as well as for the year ahead. Helping members prepare for an upgrade to Oracle Applications Release 12, providing additional smaller, more targeted regional events, and continuing to emphasize a return to basics with networking and education are at the top of his to-do list for 2011.
Posted March 23, 2011
McAfee has announced its intention to acquire Sentrigo, a privately owned provider of database security and compliance, assessment, monitoring, and intrusion prevention solutions. In addition, McAfee has announced a comprehensive database security solution to protect business-critical databases without impacting performance and availability. McAfee's coordinated approach, based on the Security Connected initiative launched in October 2010, involves protecting a company's most important data assets from network to server to the database itself. The result is data that is protected in every state (data in motion, data at rest, and data in use) via access controls, network security, server security, data protection, and encryption - all centrally managed to minimize risk and maximize efficiency.
Posted March 23, 2011
Revolution Analytics, a commercial provider of software and services based on the open source R project for statistical computing, and IBM Netezza announced they are teaming up to integrate Revolution R Enterprise and the IBM Netezza TwinFin Data Warehouse Appliance. According to the vendors, this will enable customers to directly leverage the capabilities of the open source R statistics language as they run high-performance predictive analytics from within data warehouse platforms.
Posted March 23, 2011
Despite highly publicized data breaches - ranging from the loss of personally identifiable information such as credit card and Social Security numbers at major corporations to the WikiLeaks scandal involving sensitive U.S. Department of Defense and U.S. State Department information - and the "alphabet soup" of compliance regulations, data around the globe remains at grave risk, according to John Ottman, president and CEO of Application Security, Inc. Ottman has written "Save the Database, Save the World" to focus attention on the problem and present steps to its solution. While super secure networks are important, that alone is far from enough, and a layered data security strategy with a commitment to "protecting data where it lives - in the database" must be pursued to avoid risks posed by outside hackers as well as authorized users, says Ottman. A stronger government hand may be needed as well to defend "the critical infrastructure that operates in the private sector," he suggests.
Posted March 23, 2011
Data continues growing rapidly, flowing into enterprises from traditional sources as well as new pipelines fueled by web and social media. Often presented in a range of formats and structures, this data onslaught phenomenon has come to be known as "big data." Companies, educational institutions, and government agencies are striving to meet the management challenge of this data deluge as well as mine this wealth of information for business advantage. In this special section, DBTA asks key vendors to explain their strategies for enabling customers to better handle ever-increasing data stores.
Posted March 09, 2011
NoSQL Option: Triplestore Databases
Posted March 09, 2011
The recent public release of thousands of leaked U.S. State Department cables by WikiLeaks continues to shake up governments across the world. The information captured and sent out into the wild is not only an embarrassment to U.S. government officials whose candid assessments of foreign leaders were exposed; it also shows that even an organization with some of the tightest and most comprehensive data security technologies, protocols, and policies in the world can unknowingly fall victim to a massive data breach. Can private corporations or smaller government agencies with less-stringent security protocols and standards expect to do any better? Securing data is tough enough, and now, with the increase of initiatives such as virtualization and cloud computing, the odds of loss of control and proliferation of sensitive data become even greater.
Posted March 09, 2011
A new survey of database administrators and managers reveals that a pervasive culture of complacency hampers information security efforts, and as a result of lax practices and oversight, sensitive data is being left vulnerable to tampering and theft. While tools and technologies provide multiple layers of data security both inside and outside the firewall, organizations appear to lack the awareness and will to make security stick. The study, "Data in the Dark: Organizational Disconnect Hampers Information Security," was conducted by Unisphere Research among 761 members of PASS, the Professional Association for SQL Server, in September 2010. The survey was fielded in partnership with Application Security, Inc.
Posted March 09, 2011
A new survey of 430 members of the Oracle Applications Users Group (OAUG) reveals that organizations lack a sense of urgency about securing critical data, and that the greatest challenges to securing application and data environments are primarily organizational and budget-related. The survey was conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Application Security, Inc. (AppSec), a provider of database security, risk and compliance solutions, in December 2010. According to the OAUG's 2011 Data Security report, "Managing Information in Insecure Times," 53% of respondents stated that budget was the greatest impediment holding back information security efforts. Thirty-three percent said that a lack of understanding of the threats prevents them from rallying support for countermeasures. And more than one-quarter of respondents cited a disconnect between IT teams and executive management as a major impediment to implementing proper security measures. The study shows a serious lack of understanding of, and concern for, data and application security in today's organizations, according to Thom VanHorn, vice president of global marketing at AppSec. "My take-away from the study is that there is a lack of communication, there is a lack of buy-in at the highest levels, and there is not a focus on implementing best practices," VanHorn says.
Posted February 23, 2011
Cloud and Hadoop - Keys to a Perfect Marriage Explored in New DBTA Webcast On-Demand
Posted February 23, 2011
The market for data warehouse appliances - solutions consisting of integrated software and hardware - is heating up, with new twists emerging from both established and new appliance vendors. Netezza, an early proponent of the appliance approach, was acquired in November 2010 by IBM. Here, Phil Francisco, vice president, product management and product marketing for IBM Netezza, shares his views on what's changing and what's ahead for appliances. Going forward, he anticipates very specific, vertically oriented solutions built on appliances, which will take into account the kinds of data models and the kind of functionality required for industries such as telco, retail, and financial services.
Posted February 23, 2011
The SHARE conference convenes on February 27th in Anaheim, with an agenda packed with industry initiatives and knowledge-sharing on the latest best practices and technology trends. In this Q&A, SHARE president Janet Sun provides her vision for the IBM users group in the coming years. "We see the mainframe as the center of the enterprise IT universe. If you don't think so, try unplugging it," says Sun. "Our organization focuses on enterprise IT, and that includes the mainframe. Today's SHARE membership continues to strive to leverage advances in information technology, and SHARE is a great place to do that."
Posted February 23, 2011
Data growth is driving the use of virtualization within data centers. The virtualization evolution from server to storage to desktop is catching on at many small-to-medium size businesses, as well as at large enterprises. Aimed at providing a better end-user and administrator experience than their physical counterparts, virtualized desktops promise lower cost of acquisition and management with a highly scalable, easy-to-deploy and fully protected environment. However, virtual desktop infrastructure (VDI) brings a set of new challenges. Chief among these are storage and server resource allocation, along with data protection and recovery.
Posted February 02, 2011
IBM announced the latest release of the Informix database server, version 11.7, in October 2010, thus marking the fourth major release since Informix joined the company. One of the most exciting features in Informix 11.7 is the "Flexible Grid." Wouldn't you like to administer multiple servers as easily as a single server? Wouldn't you like to mix different hardware, operating systems, and versions of software? The Informix Flexible Grid provides this capability.
Posted February 02, 2011
DBTA Webcast on How to Improve DB Security with Virtual Patching Now Available on Demand
Posted February 02, 2011
There is a wealth of information, connections and relationships within the terabytes and petabytes of data being collected by organizations on distributed cloud platforms. Utilizing these complex, multi-dimensional relationships will be the key to developing systems to perform advanced relationship analysis. From predictive analytics to the next generation of business intelligence, "walking" the social and professional graphs will be critical to the success of these endeavors.
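As a minimal sketch of what "walking" a graph means in code, here is a breadth-first traversal over a toy in-memory adjacency list; production systems would run the same idea against a distributed graph store:

```python
# Breadth-first "walk" of a toy social graph: find everyone within
# two hops of a starting user.
from collections import deque

graph = {
    "alice": ["bob", "carol"],
    "bob":   ["alice", "dave"],
    "carol": ["alice", "erin"],
    "dave":  ["bob"],
    "erin":  ["carol"],
}

def within_hops(start, max_hops):
    """Return {person: hop_distance} for everyone reachable within max_hops."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if seen[person] == max_hops:
            continue  # don't expand past the hop limit
        for friend in graph.get(person, []):
            if friend not in seen:
                seen[friend] = seen[person] + 1
                queue.append(friend)
    return seen

print(within_hops("alice", 2))
# {'alice': 0, 'bob': 1, 'carol': 1, 'dave': 2, 'erin': 2}
```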
Posted February 02, 2011
The exponentially increasing amounts of data being generated each year make getting useful information from that data more and more critical. The information frequently is stored in a data warehouse, a repository of data gathered from various sources, including corporate databases, summarized information from internal systems, and data from external sources. Analysis of the data includes simple query and reporting, statistical analysis, more complex multidimensional analysis, and data mining.
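As a toy illustration of the first two analysis styles - simple reporting and a step toward multidimensional analysis - using an in-memory SQLite table as a stand-in for a warehouse fact table:

```python
# Tiny stand-in for a warehouse fact table, queried two ways.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "widgets", 120.0), ("East", "gadgets", 75.0),
     ("West", "widgets", 200.0), ("West", "gadgets", 50.0)],
)

# Simple query and reporting: one dimension, one measure.
for row in conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(row)

# Toward multidimensional analysis: grouping on two dimensions
# approximates one face of an OLAP cube.
for row in conn.execute(
    "SELECT region, product, SUM(amount) FROM sales GROUP BY region, product"
):
    print(row)
```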
Posted January 07, 2011
Business Intelligence (BI) systems are used to improve an enterprise's decision making by combining tools for gathering, storing, accessing, and analyzing business data. While traditional features for querying, reporting, and analytics have long been the core focus of these tools, BI offerings have evolved in recent years into comprehensive, enterprise-wide platforms, and newer trends, such as self-service BI, have helped to sustain interest in this technology.
Posted January 07, 2011
With its January 2010 acquisition of Sun Microsystems, Oracle gained the MySQL open source database management system (DBMS) for enterprise IT environments. MySQL is designed to let users design and manage complex applications and data sets, and has gained a substantial share of the overall DBMS market.
Posted January 07, 2011
DBTA Hadoop Webcast Now Available on Demand
Posted January 07, 2011
When designing a system, an architect must conform to all three corners of the CIA (Confidentiality, Integrity and Availability) triangle. System requirements for data confidentiality are driven not only by business rules but also by legal and compliance requirements. As such, data confidentiality (when required) must be preserved at any cost and irrespective of performance, availability, or any other implications. Integrity and Availability, the other two sides of the triangle, may allow some flexibility in design.
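A minimal sketch of putting confidentiality first - encrypting a sensitive field before it is stored, accepting the CPU cost on every read and write. This assumes the third-party Python cryptography package; key management is waved away for brevity:

```python
# Field-level encryption sketch; assumes `pip install cryptography`.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, held in a key vault, never beside the data
cipher = Fernet(key)

ssn = b"078-05-1120"
stored = cipher.encrypt(ssn)     # this opaque token is what lands in the database
print(stored)                    # useless to anyone without the key
print(cipher.decrypt(stored))    # b'078-05-1120'
```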
Posted January 07, 2011
As security threats increase and become more sophisticated, organizations face pressure to implement strong processes and technology solutions to ensure compliance and the safety of critical assets. The risks associated with a data breach can be devastating, regardless of whether it is due to a simple mistake or a stolen end-point device such as a laptop. The impact goes beyond fines and lost revenue, to negatively impacting an organization's brand identity and equity, or jeopardizing customers' trust. Providing greater clarity, as well as aligning with industry changes and best practices, Version 2.0 of the PCI DSS standard went into effect earlier this month.
Posted January 07, 2011
The idea of moving off IMS might seem compelling at first glance, but once you look at the whole picture, you might think otherwise. Most people think of cost as the primary reason to move off IMS. But if you compare all of the costs of IMS on a mainframe against a Windows/UNIX solution, you will find that running IMS is actually cost-effective. The obvious cost elements are hardware and software, plus the huge expense of converting hundreds of thousands of lines of code and hundreds of databases. However, these are only a small part of the story.
Posted January 07, 2011
These days, many companies recognize that there are severe repercussions to ignoring or undervaluing data security, and a sizable segment of organizations - at least one-third in many cases - have been taking additional measures to bolster their data security.
Posted November 30, 2010
When Data Virtualization?
Posted November 30, 2010
One common challenge I have observed during ITIL service catalog implementations pertains to the handling of out-of-band requests. That is, how should one manage a request for a service that is not in the catalog?
Posted November 30, 2010
The year 2010 brought many new challenges and opportunities to data managers' jobs everywhere. Companies, still recovering from a savage recession, increasingly turned to the power of analytics to turn data stores into actionable insights, and hopefully gain an edge over less data-savvy competitors. At the same time, data managers and administrators alike found themselves tasked with managing and maintaining the integrity of rapidly multiplying volumes of data, often presented in a dizzying array of formats and structures. New tools and approaches were sought, and the market churned with promising new offerings embracing virtualization, consolidation, and information lifecycle management. Where will this lead in the year ahead? Can we expect an acceleration of these initiatives and more? DBTA looked at new industry research, and spoke with leading experts in the data management space, to identify the top trends for 2011.
Posted November 30, 2010
If data is the lifeblood of an enterprise, a robust master data management (MDM) solution may well be the heart, pumping purified data downstream to vital applications and databases while simultaneously accepting inaccurate and old data for cleansing and enrichment. This "bloodstream," as we know it, comprises a myriad of subject areas, or domains. Though the MDM market may well consider itself conceptually and technically mature, end users still struggle to determine whether they should embrace specialist MDM solutions dedicated to supporting one subject area, or make one strategic acquisition and implement truly multi-domain software that addresses multiple subject areas.
Posted November 09, 2010
Leveraging Data Models for Business Intelligence and Data Warehousing Agility
Posted November 09, 2010
There has been a lot of interest lately in NoSQL databases and, of course, many of us have strong backgrounds and experience in traditional relational "SQL" databases. For application developers, this raises questions concerning the best way to go. One recurring truth that eventually surfaces with all new software technologies is that "one size does not fit all." In other words, you need to use the right tool for the job, as each has its own strengths and weaknesses. In fact, a danger of many new architectural approaches is "over-adoption" - using a given tool to address a wide array of situations when it was originally designed for the specific problem domain in which it excels.
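A toy contrast of that point: a key-value lookup is ideal for fetching one record by key, while a relational query shines when the question spans many records at once:

```python
# "Right tool for the job" in miniature: key-value vs. relational access.
import sqlite3

# Key-value style: constant-time fetch by key, but no natural way to
# ask set-oriented questions such as "who is over 30?".
kv_store = {"user:1": {"name": "Ann", "age": 34},
            "user:2": {"name": "Bo",  "age": 27}}
print(kv_store["user:1"])

# Relational style: declarative queries across the whole data set.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT, age INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?, ?)",
                 [(1, "Ann", 34), (2, "Bo", 27)])
print(conn.execute("SELECT name FROM users WHERE age > 30").fetchall())
```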
Posted November 09, 2010
When IBM developers set out to build the next version of Informix, their goal was to build on the foundation of one of the more mature, effective, and reliable pieces of information management software in the industry. With the 10th anniversary of the IBM acquisition of Informix fast approaching, they knew that the 11.7 release would be closely watched by clients and partners alike.
Posted October 12, 2010
Cloud computing offers the promise of greater agility, resource optimization, and user performance, yet many businesses are understandably leery about jumping onto the cloud bandwagon until they have assurances that hosted resources will be secure. In fact, security concerns are the main obstacle to widespread cloud computing adoption among enterprises today. Before taking advantage of these capabilities, businesses need to assure users they have a simple way to access all their applications, and trust that their information is secure in the cloud.
Posted October 12, 2010
The flood of digital information increases the need for accuracy - including knowing which data to leave out. Remember when we used to ride around in our cars and listen to AM radio? Maybe you're not quite old enough to remember, but there was a time when AM radio was all we had - and that was fine. There also used to be only a handful of television channels, which we had to get up out of our chairs to change. That was fine, too. We didn't long for a wider variety of music on the radio, or more channels to watch on TV. We had what we had, and it was all fine - it was all "good enough."
Posted October 12, 2010
The relational database - or RDBMS - is a triumph of computer science. It has provided the data management layer for almost all major applications for more than two decades, and when you consider that the entire IT industry was once described as "data processing," this is a considerable achievement. For the first time in several decades, however, the relational database's stranglehold on database management is loosening. The demands of big data and cloud computing have combined to create challenges that the RDBMS may be unable to adequately address.
Posted October 12, 2010
InterSystems Corporation has rolled out a new version of its Caché high-performance object database. The new release targets the growing demand by CIOs for economical high availability by introducing database mirroring, while also addressing Java developers' need for high-volume, high-performance processing combined with persistent storage for event processing systems. Robert Nagle, InterSystems vice president for software development, recently chatted with DBTA about the release and the new features it offers. Commenting on the growing interest in NoSQL databases, Nagle observes that many of the beneficial characteristics people see in NoSQL are in fact true of Caché - a flexible data model and zero DBA cost. "But for us, what is unique is that it is not NoSQL, it is that it needs to be SQL - without the overhead of relational technology - because I think SQL is extremely important for almost every class of application that is deployed."
Posted October 12, 2010
Oracle is a fast-changing company, and in recent years, its pace has accelerated to blinding speed. The software giant has expanded well beyond its relational database roots to encompass applications, management tools, service-oriented architecture and middleware, and even hardware. There are now many components to Oracle - from three major databases to enterprise resource applications to web applications to development languages to open source desktop tools.
Posted September 07, 2010
Organizations turn to master data management (MDM) to solve many business problems - to reach compliance goals, improve customer service, power more accurate business intelligence, and introduce new products efficiently. In many cases, the need for an MDM implementation is dictated by the business challenge at hand, which knows no single data domain. Take a manufacturing customer, for example. The company decided to deploy an MDM solution in order to solve buy-side and sell-side supply chain processes, to more effectively manage the procurement of direct and indirect materials and to improve the distribution of products. To meet these goals the solution must be capable of managing vendor, customer, material and product master data. Unfortunately, quite a few vendors sell technology solutions that focus exclusively on either customer data integration (CDI) or product information management (PIM), solving only a piece of the business problem.
Posted September 07, 2010
Brent Ozar achieved SQL Server 2008 Master status earlier this year, becoming the fifth person in the U.S. outside of Microsoft to achieve the company's highest technical certification. A Quest Software SQL Server expert at the time, Ozar has since joined SQLskills.com, a provider of training and consulting focused on Microsoft SQL Server, as a principal consulting partner. In this issue, he provides a rare glimpse into the intense 3-week-long onsite program that included the most difficult exams he had ever seen.
Posted September 07, 2010
Many organizations now possess sophisticated analysis tools and dashboards that connect to back-end systems and enable them to peer deeply into their businesses to assess progress on all fronts - from revenues to stock-outs to employee performance. However, a recent survey of 279 Oracle applications managers reveals that when it comes to decision making, simple spreadsheets remain the tool of choice. And business users still wait days, weeks, and months for their IT departments to deliver reports, despite significant investments in performance management systems.
Posted September 07, 2010
IBM has entered into a definitive agreement to acquire Storwize, a privately held company based in Marlborough, Mass. Storwize provides real-time data compression technology to help clients reduce physical storage requirements by up to 80%, improving efficiency and lowering the cost of making data available for analytics and other applications. With Storwize, IBM says, it is acquiring storage technology that is unique in the industry due to its ability to compress primary data, or data that clients are actively using, of multiple types - from files to virtualization images to databases - in real time while maintaining performance. "This is in contrast to what we see our competitors doing, which is primarily focusing on compressing data that is inactive, or data at rest - backup data, as an example," explained Doug Balog, vice president of IBM Storage, during a conference call announcing the planned acquisition.
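As a toy demonstration of how much repetitive data can shrink - real-time primary-storage compression of the kind Storwize sells is far more sophisticated, but the ratio arithmetic is the same idea:

```python
# Compressing highly repetitive records with zlib to show the ratio idea.
import zlib

data = b"2010-08-10,ORDER,SKU-1234,qty=1;" * 10_000
compressed = zlib.compress(data)
print(f"raw: {len(data):,} bytes, compressed: {len(compressed):,} bytes")
print(f"savings: {100 * (1 - len(compressed) / len(data)):.1f}%")
```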
Posted August 10, 2010
First elected to Oracle Applications Users Group board of directors in 2009, David Ferguson became president of the OAUG this year. He talks with DBTA this month about how the users group is getting "back to basics" with educational sessions and networking opportunities as well as the new approaches it is taking to meet its members' evolving needs.
Posted August 10, 2010
Earlier this year, Andy Flower took over as president of the Independent Oracle Users Group from Ian Abramson. With Oracle OpenWorld right around the corner, Flower talks with DBTA about how the IOUG is changing to best meet the challenges and opportunities presented by the expanding Oracle ecosystem, despite what continues to be a difficult economy. For the IOUG, it is "the year of the member" and it all starts with the database, he says.
Posted August 10, 2010
Many see 2010 shaping up as a boom year for cloud computing, with cloud adopters capable of realizing significant reductions in administrative IT costs compared to non-adopters. However, it's not enough to simply develop and implement a cloud strategy. Rather, enterprises must take into account the performance of their cloud-based assets and the impact of the cloud on their end users' and customers' experiences. After all, the apparent cost and elasticity advantages of the cloud won't yield any business benefit if the direct consequence is a poor end user experience. For this reason, businesses considering the cloud must do their due diligence and insist on performance guarantees from cloud service providers that map directly to business objectives - or risk impacting revenue, brand image, and customer satisfaction.
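As a minimal sketch of tying a performance guarantee to a measurable check - the URL and the two-second threshold are placeholder assumptions, and this relies on the third-party requests package:

```python
# Simple end-user response-time probe against a placeholder URL.
import time
import requests

def response_time(url, threshold_s=2.0):
    """Time one request and flag it against an assumed SLA threshold."""
    start = time.perf_counter()
    requests.get(url, timeout=10).raise_for_status()
    elapsed = time.perf_counter() - start
    status = "OK" if elapsed <= threshold_s else "SLA BREACH"
    print(f"{url}: {elapsed:.2f}s [{status}]")
    return elapsed

response_time("https://app.example.com/health")  # placeholder endpoint
```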
Posted August 10, 2010