Data Quality

Solutions and Services for Data Quality include Master Data Management, Data Cleansing, Data Deduplication, Address Verification, Customer Contact Data Management, Customer Relationship Management (CRM), Golden Record Creation, Geocoding, Data Integration, Data Management, and Mailing Software for Adherence to U.S. Postal Regulations.



Data Quality Articles

Queplix Corp., a provider of data integration and data management products, has introduced the new Data Quality Manager for QueCloud, enabling companies to create and maintain data consistency throughout the data migration, integration and management lifecycle with a single cloud-based platform. As a central component of the QueCloud dashboard, Data Quality Manager is tightly coupled with the solution's core data integration and data management functionality.

Posted June 21, 2011

Talend, a developer and distributor of open source middleware, has announced Talend Cloud, a cloud-enabled platform that unifies integration across on-premise systems, cloud-based systems and SaaS applications. Based on Talend's Unified Integration Platform, it also provides a common environment for managing the entire lifecycle of integration processes, including a graphical development environment, a deployment mechanism and runtime environment for operations, and a monitoring console for management, all built on top of a shared metadata repository.

Posted June 21, 2011

Is the day of reckoning for big data upon us? To many observers, the growth in data is nothing short of incomprehensible. Data is streaming into, out of, and through enterprises from a dizzying array of sources: transactions, remote devices, partner sites, websites, and nonstop user-generated content. Not only are the resulting data stores driving databases to scale into the terabyte and petabyte range, but the data also arrives in an unfathomable range of formats, from traditional structured, relational data to message documents, graphics, videos, and audio files.

Posted June 08, 2011

Data quality, MDM, and data governance software vendor Ataccama Corporation announced that it has entered into a cooperative partnership with Teradata Corporation, a leader in data warehousing and enterprise analytics. The partnership is aimed at enabling joint customers to improve data quality within their data warehouses. Ataccama is a global software company with headquarters in Prague and offices in Toronto, Stamford, London, and Munich, and the new partnership with Teradata is worldwide in scope, according to Michal Klaus, CEO, Ataccama Corp.

Posted May 24, 2011

Oracle has agreed to acquire Datanomic Limited, a provider of customer data quality software and related applications for risk and compliance screening. According to Oracle, the Datanomic technology combined with Oracle Product Data Quality will provide a complete data quality solution to reduce the cost and complexity of managing data across its customers' businesses. The transaction is expected to close in the first half of calendar year 2011, and Datanomic's management and employees are expected to join Oracle.

Posted April 21, 2011

Trillium Software has formed an alliance with Microsoft to provide integration between the Trillium Software System data quality solution and Microsoft Dynamics CRM 2011 customer relationship management (CRM) software. As a result, Microsoft Dynamics CRM users who choose to leverage integrated Trillium Software data quality services can ensure that global customer data is accurate and fit for purpose, whether using on-premises or cloud deployment models (utilizing the Windows Azure platform).

Posted April 19, 2011

Melissa Data Corp, a developer of data quality and address management solutions, has announced that customers can now access detailed property and mortgage data on more than 140 million U.S. properties by using the company's new WebSmart Property Web Service. The comprehensive solution can source nearly any information on a given property, from parcel and owner information to square footage, zoning, and more.

Posted April 12, 2011

Melissa Data Corp, a developer of data quality and address management solutions, today announced completion of CASS Cycle N certification. Certification of Melissa's software comes months ahead of the USPS July 31, 2011 expiration of CASS Cycle M. In order to continue to qualify for postal automation discounts, CASS vendors must deliver CASS Cycle N to their customers beginning May 1, 2011. With CASS Cycle N, SuiteLink, a USPS product that improves mail delivery by adding known secondary information (suite numbers) to business addresses, will be required for processing.

Posted February 15, 2011

When designing a system, an architect must conform to all three corners of the CIA (Confidentiality, Integrity and Accessibility) triangle. System requirements for data confidentiality are driven not only by business rules but also by legal and compliance requirements. As such, data confidentiality (when required) must be preserved at any cost, irrespective of performance, availability or any other implications. Integrity and Accessibility, the other two sides of the triangle, may allow some flexibility in design.

Posted January 07, 2011

Melissa Data, a developer of high performance data quality and address management solutions, has announced expanded coverage for its GeoCoder Object. Available as a multiplatform API or as part of WebSmart Services, GeoCoder Object now provides accurate location-based information on 95% of all rooftops in the U.S.

Posted December 14, 2010

The IOUG has completed a number of ground-breaking studies in 2010 through the IOUG ResearchWire program. Conducted among IOUG members by Unisphere Research, 2010 IOUG ResearchWire Executive Summaries are available to all on the IOUG website.

Posted December 01, 2010

The year 2010 brought many new challenges and opportunities to data managers' jobs everywhere. Companies, still recovering from a savage recession, increasingly turned to the power of analytics to turn data stores into actionable insights, and hopefully gain an edge over less data-savvy competitors. At the same time, data managers and administrators alike found themselves tasked with managing and maintaining the integrity of rapidly multiplying volumes of data, often presented in a dizzying array of formats and structures. New tools and approaches were sought, and the market churned with promising new offerings embracing virtualization, consolidation and information lifecycle management. Where will this lead in the year ahead? Can we expect an acceleration of these initiatives and more? DBTA looked at new industry research, and spoke with leading experts in the data management space, to identify the top trends for 2011.

Posted November 30, 2010

Sybase has issued the third and final installment of results from a study on the business impacts of effective data. The study benchmarked some of the world's leading companies across a range of vertical industries by measuring the direct correlation between a company's IT investments and overall business performance.

Posted November 17, 2010

At InformaticaWorld last week, Informatica announced the general availability of the latest release of its master data management (MDM) product, Informatica 9 MDM.

Posted November 09, 2010

Estimates put the amount of data in existence at this time at more than a zettabyte (or a trillion gigabytes), which would be the equivalent of 75 billion fully loaded iPads. All this data is streaming into and through enterprises from transactions, remote devices, partner sites and user-generated content, with formats varying from structured, relational data to graphics and videos.

Posted October 06, 2010

Melissa Data, a developer of high performance data quality and address management solutions, showcased the Contact Verification Server at Oracle OpenWorld. Providing a turnkey solution, the appliance is built by Dell and incorporates six WebSmart components for contact data verification and enrichment, including address, phone, and email verification, name parsing, geocoding and change-of-address processing. The server can verify more than 7 million records per hour and additional servers can be clustered together for increased scalability, throughput and redundancy.

Posted October 06, 2010

Sybase, Inc., an SAP company, has revealed the results of a new report, "Measuring the Business Impacts of Effective Data." The study benchmarks leading enterprises across a wide range of industries by measuring the direct correlation between a company's IT investments and overall business performance.

Posted September 22, 2010

Trillium Software, a business of Harte-Hanks, Inc., has introduced the latest version of the Trillium Software System. Designed to visualize data issues, improve business rule validation and enhance data quality monitoring in operational environments, the new Trillium Software System is aimed at helping business users and analysts, data stewards and IT professionals work together to improve business decisions and outcomes through better, more accurate information.

Posted September 22, 2010

Oracle has introduced Oracle GoldenGate 11g and Oracle Data Integrator Enterprise Edition 11g, new releases of the two products that form the foundation of Oracle's data integration product line. Oracle GoldenGate 11g delivers real-time data integration and continuous availability for mission-critical systems through its low-impact, low-latency data acquisition, distribution and delivery capabilities, and Oracle Data Integrator Enterprise Edition 11g provides loading and transformation of data into a data warehouse environment through its high-performance extract, load and transform (E-LT) technology.

Posted September 14, 2010

Organizations turn to master data management (MDM) to solve many business problems: to reach compliance goals, improve customer service, power more accurate business intelligence, and introduce new products efficiently. In many cases, the need for an MDM implementation is dictated by the business challenge at hand, which knows no single data domain. Take a manufacturing customer, for example. The company decided to deploy an MDM solution in order to improve buy-side and sell-side supply chain processes, to more effectively manage the procurement of direct and indirect materials, and to improve the distribution of products. To meet these goals the solution must be capable of managing vendor, customer, material and product master data. Unfortunately, quite a few vendors sell technology solutions that focus exclusively on either customer data integration (CDI) or product information management (PIM), which solves only a piece of the business problem.

Posted September 07, 2010

Melissa Data, a developer of data quality and address management solutions, has released a 25th anniversary special edition catalog. The catalog provides detailed listings on over 60 different Melissa Data products and services including enterprise data quality platforms, developer tools, address management software, mailing lists, and data hygiene services. The catalog also offers links to many white papers on data quality and direct marketing, as well as information on the Melissa Data Independent Software Vendor Program, and their Data Quality Challenge.

Posted July 20, 2010

When integrating data, evaluating objects from multiple sources aids in determining their equivalence. Each source may identify customers, but determining which customer from each system represents the same customer can prove daunting. Sometimes matching things is straightforward; for example, if all sources should have an accurate Social Security number or taxpayer ID, success involves simply linking the matching numbers.
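
Where such a key exists, the linkage can be as mechanical as joining the two sources on that identifier. Below is a minimal Python sketch, assuming two hypothetical customer feeds whose record layouts and values are invented purely for illustration:

    # Hypothetical example: link customer records from two source systems
    # that both carry a (supposedly reliable) taxpayer ID.
    crm_customers = [
        {"crm_id": "C-101", "name": "Acme Corp",      "tax_id": "12-3456789"},
        {"crm_id": "C-102", "name": "Blue Ridge LLC", "tax_id": "98-7654321"},
    ]
    billing_customers = [
        {"bill_id": "B-9", "name": "ACME Corporation", "tax_id": "12-3456789"},
        {"bill_id": "B-7", "name": "Cedar Partners",   "tax_id": "55-1234567"},
    ]

    # Index one source by the matching key, then probe it with the other.
    by_tax_id = {c["tax_id"]: c for c in crm_customers}

    matches, unmatched = [], []
    for bill in billing_customers:
        crm = by_tax_id.get(bill["tax_id"])
        if crm:
            matches.append((crm["crm_id"], bill["bill_id"]))  # same real-world customer
        else:
            unmatched.append(bill["bill_id"])  # candidate for fuzzy matching or review

    print(matches)    # [('C-101', 'B-9')]
    print(unmatched)  # ['B-7']

Records that fall out of such an exact-key join are where the daunting part begins, since they typically require fuzzy comparison of names and addresses, or manual review.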

Posted July 12, 2010

Everybody seems to agree with the need for organizations to do a better job of protecting personal information. Every week the media brings us reports of more data breaches, and no organization is immune. Hospitals, universities, insurers, retailers, and state and federal agencies all have been the victims of breach events, often at significant costs. State privacy laws such as the new Massachusetts privacy statutes have placed the burden of protecting sensitive information squarely on the shoulders of the organizations that collect and use it. While some managers might view this as yet one more compliance hurdle to worry about, we feel it presents an excellent opportunity to evaluate existing practices and procedures. The good news is that there are some great solutions available today that can help organizations of all stripes address these requirements while at the same time tightening data security practices, streamlining operations, and improving governance.

Posted July 12, 2010

Pervasive Software Inc., a global leader in cloud-based and on-premises data integration software, has formed a partnership with Melissa Data, a provider of data quality software and services. The partnership includes out-of-the-box connectivity to Melissa Data's data quality offerings, giving both companies' customers the benefits of seamless data quality and data integration capabilities.

Posted June 15, 2010

Quality can be a hard thing to define. What is good and what is bad may not be easily identified and quantified. When a data mart accurately reflects data exactly as found in the source, should that be considered a quality result? If the source data is bad, is the data mart of high quality or not? If the data mart differs from the source, when is the difference an improvement in quality and when is it evidence of diminished quality? While it may seem self-evident that correcting the source of the load data would be the "right" thing to do, in practice that course is not always taken. The reasons supporting this nonintuitive approach are varied. Sometimes changes to the source impact other processes that must not change, or the changes will expose problems that may provoke undesired political fallout, or it may simply be that making the proper adjustments to the source application would prove too costly to the organization. For all these reasons and more, in the world of business intelligence, the dependent data often is expected to be of higher quality than the source data. In order for that improvement to occur, data placed within the dependent mart or data warehouse must be altered from the source. Sometimes these alterations become codified within the process migrating data from the source; other times changes are made via one-time ad hoc updates. Either way, this alteration leads to a situation in which the dependent data no longer equates one-for-one to the source data. Superficial comparisons of this altered content will highlight the disparity: what exists for analytics is not the same as what exists in the operational system.
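
As a rough sketch of what "codified within the process migrating data from the source" can look like, the Python below applies invented cleansing rules while loading source rows into a mart; the field names, rules and values are hypothetical, not drawn from any particular product:

    # Hypothetical sketch: corrections applied during migration, so the mart
    # is deliberately "better" than, and no longer identical to, the source.
    STATE_FIXES = {"Calif.": "CA", "N.Y.": "NY"}  # invented standardization rules

    def cleanse(row):
        cleaned = dict(row)
        cleaned["state"] = STATE_FIXES.get(row["state"], row["state"])
        cleaned["email"] = row["email"].strip().lower() or None  # blank becomes NULL
        return cleaned

    source_rows = [
        {"customer": "Acme Corp",      "state": "Calif.", "email": " SALES@ACME.COM "},
        {"customer": "Cedar Partners", "state": "NY",     "email": ""},
    ]

    mart_rows = [cleanse(r) for r in source_rows]

    # A naive row-for-row comparison now reports differences by design.
    for src, mart in zip(source_rows, mart_rows):
        print(src == mart, mart)  # False for both rows

Any audit of the mart against the source then has to reason in terms of those codified rules rather than expect raw values to match.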

Posted June 07, 2010

Varonis Systems Inc., a provider of data governance software, will soon be shipping Version 5.5 of its data management and governance toolsets. The updated editions of DatAdvantage and DataPrivilege represent the latest evolution of Varonis' Meta-data Framework, which enables customers to identify sensitive unstructured and semi-structured data on their file systems, SharePoint sites and network-attached storage (NAS) devices, find areas with excessive permissions and abnormal access activity, understand who can access, who is accessing, who shouldn't have access, and who owns the data, and remediate risk faster than traditional data protection products.

Posted June 01, 2010

At its Information On Demand conference in Rome, IBM unveiled new software to place the power of predictive analytics into the hands of business users for faster, more insightful decision making. According to the company, with three clicks, business users can now build a predictive model within a configurable Web browser interface, and run simulations and "what-if" scenarios that compare and test the best business outcomes before the model is ever deployed into an operational system. Business users now have full control over the analytic process, enabling them to make accurate decisions in real-time, based on changes in strategy, customer buying patterns and behaviors, or fluctuating market conditions.

Posted June 01, 2010

HiT Software, a provider of data integration and change data capture software products, has been acquired by BackOffice Associates, a provider of data migration, data governance and master data management solutions for Oracle, SAP and other ERP vendors.

Posted May 20, 2010

Mergers and acquisitions often come quickly and when they do, it is critical to have tools and utilities capable of scaling to meet new challenges so operations continue seamlessly, customer service standards are upheld, and costs are contained. This was the case for UGI Utilities, a large natural gas and electric service provider in the eastern U.S. In 2006, UGI acquired the natural gas utility assets of PG Energy from Southern Union Company. A longtime customer of BMC, UGI found it was aligned with the right software company to provide implementation of mainframe service management solutions as well as first class support to get the job done and successfully integrate the newly acquired company's data into its environment, saving time and money.

Posted May 10, 2010

Informatica Corporation has announced new customer support offerings and proactive support capabilities as part of the Informatica Global Customer Support program. The additional services will be bundled into existing programs to further accelerate customer time-to-value, reducing cost-of-ownership and helping ensure ongoing project success.

Posted May 04, 2010

Sybase has announced the availability of Sybase PowerBuilder 12. The new release of Sybase's rapid application development tool enables developers to easily and cost effectively create or migrate their business applications on the Microsoft .NET Framework, for modern and visually appealing application user experiences.

Posted April 29, 2010

Quest Software, Inc., a Visual Studio Industry Partner and maker of Toad for Oracle, has announced the launch of Toad Extension for Visual Studio. Toad Extension for Visual Studio is a database schema provider that will support complete application lifecycle management (ALM) for Oracle in Visual Studio 2010, unifying Oracle developers with the rest of the Visual Studio development team.

Posted April 20, 2010

Talend, a provider of open source data management software, today announced the release of Talend 4.0. With this major upgrade to the company's flagship data management solutions, Talend is delivering an integrated data management platform that combines data integration, data quality and master data management (MDM) within a single solution, enabling Talend customers to further streamline and reduce the complexity of their critical data management projects.

Posted April 13, 2010

Oracle today announced the latest release of its Enterprise Performance Management (EPM) System, Release 11.1.2. As part of the 11.1.2 release, there are three new applications: two in the financial close and financial reporting area, Oracle Hyperion Financial Close Management and Oracle Hyperion Disclosure Management, and a purpose-built planning module for public sector organizations called Oracle Hyperion Public Sector Planning and Budgeting, Hari Sankar, vice president of EPM Product Management, Oracle, tells 5 Minute Briefing. There are also significant enhancements to the existing portfolio of Oracle Hyperion Enterprise Performance Management applications such as Hyperion Planning, Hyperion Financial Management, Hyperion Data Relationship Management, and Hyperion Profitability and Cost Management, as well as Oracle Essbase.

Posted April 07, 2010

Trillium Software, a provider of data quality solutions, has introduced the latest version of Global Locator, the company's one-step software solution for location intelligence and matching address information with global geocoding data. Global Locator provides a precise worldwide geocoding intelligence solution, for extremely accurate latitude/longitude information as well as address validation, cleansing and enrichment capabilities that enhance business processes and applications dependent on location data.

Posted March 30, 2010

Yesterday at the Open Source Business Conference in San Francisco, Dr. Bob Sutor, vice president, Open Source and Linux, IBM Software Group, presented a keynote address that examined key issues that users should think about when considering adoption of open source solutions.

Posted March 18, 2010

Oracle has unveiled Oracle Imaging and Process Management 11g and Oracle Forms Recognition. Both products are components of Oracle Fusion Middleware and are part of Oracle's strategic solution for Enterprise Application Documents.

Posted March 17, 2010

Progress Software Corporation yesterday announced the launch of a new business solution intended to enable enterprises to improve business performance. The solution, Progress Responsive Process Management suite (RPM), combines comprehensive visibility, business event processing and business process management (BPM) capabilities on a single, unified platform. "The reason we've created this suite is in response to what we have been seeing in our customer base. Our customers want to gain what we call 'operational responsiveness.' In other words, to have a business that is highly adaptive, can anticipate opportunities and threats, and respond before they have been missed, or before it is too late," Dr. John Bates, Progress Software CTO, tells 5 Minute Briefing.

Posted March 16, 2010

DataFlux, a provider of data management solutions, announced a unified environment that enables data quality, data integration and master data management (MDM) to be managed from a single interface. Called the DataFlux Data Management Platform, the system is designed to help organizations to plan, build, implement and monitor data-centric projects, and extend them across the enterprise.

Posted March 09, 2010

Symantec Corp. announced today that it has prepared a data management toolset that it will eventually be shipping as part of all its product lines over the coming year. The toolset, called Data Insight, leverages Symantec's core competencies of security and storage to enable improved governance, infer data ownership based on usage, and track data utilization.

Posted March 02, 2010

Talend, a provider of open source data integration software, has announced the availability of an open source master data management (MDM) solution. With the launch, Talend says it is "democratizing" the MDM market with an affordable, open source alternative to cost-prohibitive and disjointed proprietary technologies.

Posted February 09, 2010

IBM has signed a definitive agreement to acquire Initiate Systems, a provider of data integrity software for information sharing among healthcare and government organizations. The announcement of the acquisition "marks a significant expansion of IBM's ability to help clients integrate information from hundreds of sources," Arvind Krishna, general manager of IBM's Information Management Software division, said during a teleconference detailing the acquisition.

Posted February 08, 2010

Melissa Data, a provider of data quality and address management solutions, has announced that the latest release of Presort Object, a developer's API for postal presorting, has received USPS PAVE Gold certification for PAVE Cycle K.

Posted January 12, 2010

Oracle has acquired Silver Creek Systems, Inc., a provider of product data quality solutions. Silver Creek's DataLens product data quality solution is designed to simplify complex product descriptions commonly found in many industries, enabling enterprises to more accurately manage product data. "Lack of standardized product data continues to be a challenge for many enterprises," says Hasan Rizvi, senior vice president, Oracle Fusion Middleware Product Development. "With the addition of Silver Creek, Oracle is extending its industry leading data integration offering with complementary solutions to enhance product data quality and help customers get more accurate and consistent product data for use across their enterprise."

Posted January 06, 2010

As we enter the next decade of the millennium, we will see information technology becoming more ubiquitous, driving an even greater share of business decision-making and operations. IT has proven its mettle through the recent downturn as both a tactical and strategic weapon for streamlining, as well as maintaining competitive edge. Now, as we begin the next round of economic recovery, companies will be relying on IT even more to better understand and serve their markets and customers. Yet, there are many challenges with managing a growing array of IT hardware, software, and services. To address these requirements, businesses continue to look to approaches such as analytics, virtualization, and cloud computing. To capture the trends shaping the year ahead, Database Trends and Applications spoke to a range of industry leaders and experts.

Posted December 14, 2009

Informatica Corporation, a provider of data integration software, has announced Informatica Cloud 9, a comprehensive offering for cloud data integration.

Posted November 24, 2009

How many times have you been surfing the web only to encounter a form that requests a slew of personal information before you can continue on? You know what I'm talking about. A company markets a white paper or poll results or something else that intrigues you, so you click on the link, and bang, there you are. You don't have the information you wanted yet, but if you just fill out this form then you'll be redirected to the information.

Posted November 11, 2009
