Trends and Applications



Lucid Imagination, a developer of search, discovery and analytics software based on Apache Lucene and Apache Solr technology, has unveiled LucidWorks Big Data, a fully integrated development stack that combines the advantages of multiple open source projects including Hadoop, Mahout, R and Lucene/Solr to provide search, machine learning, recommendation engines and analytics for structured and unstructured content in one solution available in the cloud. "With more and more companies being challenged by the explosive growth of information, as has been widely reported, the vast majority of that content is unstructured or semi-structured text, and traditional business intelligence or traditional analytics methodologies don't come close to addressing the vast percentage of content," Paul Doscher, CEO of Lucid Imagination, tells DBTA.

Posted June 28, 2012

IBM stepped up its smarter computing initiative with a broad range of performance and efficiency enhancements to its storage and technical computing systems - the engines of big data. As part of its ongoing smarter computing effort, IBM has announced a new strategic approach to designing and managing storage infrastructures with greater automation and intelligence, as well as performance enhancements to several key storage systems and the Tivoli Storage Productivity Center suite. IBM also announced its first offerings that incorporate software from IBM's acquisition of Platform Computing earlier this year. "Enterprises are dealing with data that is increasing exponentially in both size and complexity," said Rod Adkins, senior vice president of IBM Systems & Technology Group. The enhanced systems and storage solutions have the performance, efficiency, and intelligence to handle this big data, he added.

Posted June 28, 2012

Data analytics vendor Teradata and information management software provider Kalido have introduced a new joint solution that they say will allow customers to build or expand a data warehouse in 90 days or less, providing deeper analytics to users for improved business decision-making. This solution combines the Teradata Data Warehouse Appliance with the Kalido Information Engine, providing customers with a streamlined data consolidation tool that aggregates disparate data into a single unified platform.

Posted June 28, 2012

Companies are scrambling to learn all the various ways they can slice, dice, and mine big data coming in from across the enterprise and across the web. But with the rise of big data — hundreds of terabytes or petabytes of data — comes the challenge of where and how all of this information will be stored. For many organizations, current storage systems — disks, tapes, virtual tapes, clouds, in-memory systems — are not ready for the onslaught, industry experts say. There are new methodologies and technologies coming on the scene that may help address this challenge. But one thing is certain: Whether organizations manage their data in their internal data centers, or in the cloud, a lot more storage is going to be needed. As Jared Rosoff, director of customer engagement with 10gen, puts it: "Big data means we need 'big storage.'"

Posted June 13, 2012

As data continues to grow unabated, organizations are struggling to manage it more efficiently. By better leveraging their expanding data stores and making the information available more widely, organizations hope to put big data to work — helping them to achieve greater productivity and more informed decision making, as well as compete more effectively as a result of insights uncovered by analytics on their treasure troves of information. Improving the management of big data is not something to consider addressing at some point in the hazy future — the big data challenge is already here, according to a new survey of 264 data managers and professionals who are subscribers to Database Trends and Applications.

Posted June 13, 2012

Social media network-based business intelligence represents the next great frontier of data management, promising decision makers vast vistas of new knowledge gleaned from exabytes of data generated by customers, employees, and business partners. Mining data from Facebook, Twitter, blogs, wikis, and internal corporate networks potentially may surface new insights into impending market shifts, patterns in customer sentiment, and competitive intelligence. It's a rich opportunity not lost on today's organizations, a new survey of 711 business and IT managers from across the globe reveals. A majority of respondents are either planning to collect and analyze data from both proprietary and public social media networks, or are doing so already.

Posted June 13, 2012

Informix Genero, a new IBM offering developed in partnership with Four Js Development Tools, is a logical enhancement to the Informix 4GL language and environment that offers extensive capabilities for developing modern web and desktop GUI applications, reports, and web services. With IBM Informix Genero, users can recompile 4GL applications and run them as GUI and web applications while retaining the business logic.

Posted May 23, 2012

Eagle Creek Software Services - CRM and BI Market Leader Profile

Posted May 23, 2012

As a leader in pharmacy technology, National Health Systems, Inc. provides a range of services for the retail pharmacy industry. Built on a foundation of dedication and commitment to its customers and the profession of pharmacy, NHS companies PDX, NHIN, and Rx.com provide pharmacies with the tools they need to provide the best possible patient care, manage their businesses, and enhance their competitiveness in the marketplace.

Posted May 23, 2012

Google has announced that Google BigQuery, a web service that lets users do interactive analysis of massive data sets, is now available to the public. Billed as enabling customers to "analyze terabytes of data with just a click of a button," the company says the data is secured, replicated across data centers, and can be easily exported.

Posted May 23, 2012

Informatica Corporation, a provider of data integration software and services, has announced the latest release of its Informatica Cloud solution. Offered as an integration platform-as-a-service (iPaaS), the latest release from Informatica features the Cloud Connector Toolkit, Cloud Integration Templates, and new enterprise features, all of which are part of the new Informatica Cloud Developer Edition and allow developers to rapidly embed end-user customizable integration logic and connectivity into cloud applications.

Posted May 23, 2012

Big data has become a big topic for 2012. It's not only the size, but also the complexity of formats and speed of delivery of data that is starting to exceed the capabilities of traditional data management technologies, requiring new or exotic technologies simply to manage the volume. In recent years, the democratization of analytics and business intelligence (BI) solutions has become a major driving force for data warehousing, resulting in the use of self-service data marts. One major implication of big data is that in the future, users will not be able to put all useful information into a single data warehouse. Logical data warehouses that bring together information from multiple sources as needed are replacing the single data warehouse model. Combined with the fact that enterprise IT departments are continually moving towards distributed computing environments, the need for IT process automation to automate and execute the integration and movement of data between these disparate sources is more important than ever.
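The logical data warehouse model described above can be sketched in a few lines: rather than first consolidating everything into one physical store, a query pulls from each source on demand and combines results at query time. This is a minimal illustrative sketch; the source names, schemas, and join logic are hypothetical, not any vendor's actual implementation.

```python
# A "logical data warehouse" answers a query by federating across
# multiple live sources (here, a hypothetical CRM and ERP) instead of
# copying everything into one physical warehouse first.

def query_sales_by_region(region, crm_source, erp_source):
    """Join customer records from a CRM with orders from an ERP at query time."""
    customers = [c for c in crm_source if c["region"] == region]
    ids = {c["id"] for c in customers}
    orders = [o for o in erp_source if o["customer_id"] in ids]
    return {
        "region": region,
        "customers": len(customers),
        "revenue": sum(o["amount"] for o in orders),
    }

# Toy stand-ins for the two disparate sources.
crm = [{"id": 1, "region": "EMEA"}, {"id": 2, "region": "APAC"}]
erp = [{"customer_id": 1, "amount": 100.0}, {"customer_id": 1, "amount": 50.0},
       {"customer_id": 2, "amount": 75.0}]

print(query_sales_by_region("EMEA", crm, erp))
# → {'region': 'EMEA', 'customers': 1, 'revenue': 150.0}
```

The trade-off this sketch glosses over is exactly the one driving the process-automation need mentioned above: every query pays the cost of reaching out to each source, so data movement and caching between sources must be orchestrated.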

Posted May 09, 2012

There has been a significant change in the IT world recently: solution developers no longer believe the answer to every data management challenge is a relational database. For 40 years, data management was considered a quiet part of IT where the products and providers were firmly established. It is now evident that information management has again become quite dynamic, with a broad set of solutions offering new options for managing today's big data challenges.

Posted May 09, 2012

The term "big data" refers to the massive amounts of data being generated on a daily basis by businesses and consumers alike - data which cannot be processed using conventional data analysis tools owing to its sheer size and, in many cases, its unstructured nature. Convinced that such data holds the key to improved productivity and profitability, enterprise planners are searching for tools capable of processing big data, and information technology providers are scrambling to develop solutions to accommodate new big data market opportunities.

Posted May 09, 2012

CIOs and IT departments are on the frontlines of a monumental IT shift. With the number of mobile devices and applications exploding and bandwidth soaring, they are being asked to find ways to enable the brave new world of enterprise mobility. All involved - from users to IT - recognize the productivity and business efficiency benefits of this trend, but it is typically only IT that also recognizes the dangers unchecked mobility poses to sensitive corporate data.

Posted May 09, 2012

Learning from the European Union Data Directives on Privacy

Posted April 26, 2012

Oracle addressed the need to make IT infrastructure and business analytics technologies simpler and more efficient in a presentation to OpenWorld Tokyo 2012 attendees that was also made available via live webcast. In addition to presenting its strategy and plans for business analytics, the company also unveiled new additions to its product portfolio. In his keynote address, Oracle president Mark Hurd explained how the business users of tomorrow will require faster and more comprehensive information access. "The true question with analytics is how to get the right information to the right person at the right time to make the right decision," he said.

Posted April 26, 2012

IBM has introduced DB2 10 and InfoSphere Warehouse 10 software that integrates with big data systems, automatically compresses data into tighter spaces to prevent storage sprawl, and slices information from the past, present, and future to eliminate expensive application code. Over the past 4 years, more than 100 clients, 200 business partners, and hundreds of experts from IBM Research and Software Development Labs around the world collaborated to develop the new software.

Posted April 26, 2012

MapR Technologies, Inc., provider of the MapR distribution for Apache Hadoop, has introduced new data connection options for Hadoop to enable a range of data ingress and egress alternatives for customers. These include direct file-based access using standard tools and file-based applications; direct database connectivity; Hadoop-specific connectors via Sqoop, Flume, and Hive; as well as direct access to popular data warehouses and applications using custom connectors. Additionally, technology providers Pentaho and Talend are partnering with MapR to provide direct integration with MapR's distribution, and MapR has also entered into a partnership with business intelligence and data visualization software vendor Tableau Software.

Posted April 26, 2012

Helping customers manage the influx of mobile devices, networks and applications in the enterprise, SAP has unveiled a new release of the Afaria mobile device management solution. With the 7.0 release of Afaria, SAP aims to allow enterprise IT to more effectively manage mobile applications and devices through a new user interface (UI) for simplified administration, improved workflow and enterprise integration capabilities. "The consumerization of IT is driving our innovation path and commitment to providing customers with the industry's most comprehensive, robust and streamlined mobility management platform, including mobile device management," says Sanjay Poonen, president, Global Solutions, SAP.

Posted April 26, 2012

Big News with Big Data - Commercial Enhancements to Apache Hadoop Usher in a New Era

Posted April 11, 2012

Seven Steps for a Successful Applications Rationalization Initiative

Posted April 11, 2012

The Next Information Processing Super Engine

Posted April 11, 2012

There's no question that cloud computing is a hot commodity these days. Companies of all types and sizes are embracing cloud computing-both internally and from external service providers - as a way to cost-effectively build new capabilities. With the rapid growth of cloud comes new questions about responsibility within organizations, in terms of how services will be paid for, who has ultimate say over cloud decisions, and how cloud fits into the overall strategic direction of the business.

Posted March 28, 2012


Social media business intelligence is on the rise, according to a new Unisphere Research study sponsored by IBM and Marist College. The study found that while social media monitoring and analysis is in its early stages, many organizations plan to monitor, collect, stage, and analyze this data over the next 1 to 5 years. In particular, line-of-business (LOB) respondents, who are closer to customers, show appreciation for the benefits of monitoring social media networks (SMNs).

Posted March 21, 2012

MarkLogic Corporation has joined the technology partner program of Hortonworks, a leading vendor promoting the development and support of Apache Hadoop. According to the vendors, by leveraging MarkLogic and Hortonworks, organizations will be able to seamlessly combine the power of MapReduce with MarkLogic's real-time, interactive analysis and indexing on a single, unified platform. There are two main reasons that MarkLogic has chosen to partner with Hortonworks, says Justin Makeig, senior product manager at MarkLogic. One is Hortonworks' extensive experience with Hadoop installations and the second is that its core product is 100% open source.

Posted March 21, 2012

Pentaho Corporation, an open source business analytics company, has formed a strategic partnership with DataStax, a provider of big data solutions built upon the Apache Cassandra project, a high performance NoSQL database. The relationship will provide native integration between Pentaho Kettle and Apache Cassandra. This will merge the scalable, low-latency performance of Cassandra with Kettle's visual interface for high-performance ETL, as well as integrated reporting, visualization and interactive analysis capabilities. According to the companies, organizations seeking to leverage their big data have found it difficult to implement and employ analytics technologies. "One of the big challenges today is ease of use of these tools," says Ian Fyfe, Pentaho's chief technology evangelist. Often built on open source projects, it "takes a lot of deep skills to use these systems, and these are skills that are hard to find," he explains.
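The extract-transform-load (ETL) pattern that Kettle expresses through its visual interface can be summarized in code. This is a toy sketch of the pattern only; the field names and cleansing rules below are hypothetical and are not Kettle's or Cassandra's actual APIs.

```python
# Minimal ETL pipeline: extract raw source rows, transform (normalize
# and filter) them, and load the cleaned records into a target store.

def extract(raw_lines):
    """Parse CSV-like source rows into dictionaries."""
    for line in raw_lines:
        name, email = line.split(",")
        yield {"name": name.strip(), "email": email.strip()}

def transform(rows):
    """Normalize fields and drop records that fail validation."""
    for row in rows:
        if "@" in row["email"]:
            yield {"name": row["name"].title(), "email": row["email"].lower()}

def load(rows, target):
    """Append cleaned rows to a target store (here, a plain list)."""
    for row in rows:
        target.append(row)
    return target

warehouse = load(transform(extract(["alice , ALICE@Example.com",
                                    "bob,not-an-email"])), [])
print(warehouse)  # → [{'name': 'Alice', 'email': 'alice@example.com'}]
```

Chaining generators this way mirrors how ETL tools stream rows step by step rather than materializing the whole data set between stages, which matters at the data volumes Cassandra deployments typically handle.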

Posted March 21, 2012

Novell announced an update to its ZENworks suite, which includes integrated Mac device management and full disk encryption capabilities. ZENworks 11 Support Pack 2 enables customers to lock out threats without shutting down IT access, the vendor says. ZENworks 11 now offers a more holistic approach to supporting Mac devices in the enterprise. With this release, Mac support is provided through Remote Management for Mac, Asset Management for Mac, Mac OS X patching and Mac Bundles.

Posted March 21, 2012

Improving Data Protection with Deduplication

Posted March 07, 2012

Organizational focus has been placed on the emergence of "big data" - large-scale data sets that businesses and governments use to create new value with today's computing and communications power. Big data poses many opportunities, but managing its rapid growth adds challenges, including complexity and cost. Leaders must address the implications of big data: the increasing volume and detail of information captured by enterprises, and the rise of multimedia, social media, and the internet.

Posted March 07, 2012

For enterprises grappling with the onslaught of big data, a new platform has emerged from the open source world that promises to provide a cost-effective way to store and process petabytes upon petabytes of information. Hadoop, an Apache project, is already being eagerly embraced by data managers and technologists as a way to manage and analyze mountains of data streaming in from websites and devices. Running data such as weblogs through traditional platforms such as data warehouses or standard analytical toolsets often cannot be cost-justified, as these solutions tend to have high overhead costs. However, organizations are beginning to recognize that such information ultimately can be of tremendous value to the business. Hadoop packages up such data and makes it digestible.
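The weblog processing described above follows Hadoop's MapReduce pattern: map each log line to a key-value pair, then reduce by key to totals. The toy sketch below runs the two phases in-process to show the shape of the computation; the simplified access-log layout is hypothetical, and a real Hadoop job would distribute both phases across a cluster.

```python
# Toy MapReduce: count weblog requests per HTTP status code.
from collections import defaultdict

def map_phase(log_lines):
    """Emit (status_code, 1) for each log line."""
    for line in log_lines:
        status = line.split()[-2]          # e.g. "GET /path 200 5120"
        yield status, 1

def reduce_phase(pairs):
    """Sum counts per key, as a reducer would after the shuffle."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = [
    "GET /index.html 200 5120",
    "GET /missing 404 310",
    "GET /index.html 200 5120",
]
status_counts = reduce_phase(map_phase(logs))
print(status_counts)  # → {'200': 2, '404': 1}
```

The economics mentioned in the article come from this structure: because each map call sees one line and each reduce call sees one key, the work parallelizes across commodity machines without the per-terabyte overhead of a traditional warehouse load.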

Posted March 07, 2012

Clinical Data Management (CDM) is a company headquartered in Colorado that provides clinical information database software, enabling medical institutions to report and compile data on patient care. A longtime user of Revelation Software, dating back to Revelation G and continuing through OpenInsight, CDM was pleased with both the quality of the products and the service from Revelation. However, CDM had come to realize it needed to provide a web interface for data entry to better support its customers and also stay current with evolving technology requirements. That need was answered when Revelation launched the OpenInsight for Web (O4W) Development Toolkit, a web development toolkit that makes it possible for OpenInsight developers with limited or no HTML, XML or JavaScript experience to develop feature-rich web pages.

Posted March 07, 2012

Valuable data and trusted applications are dependent on MultiValue databases at many organizations, but there is also a need to integrate that data and those applications with other systems and provide access to users in new ways. In this special section, DBTA asks leading MultiValue vendors: What is your organization doing to help customers modernize and stay current with new technologies to address the evolving requirements of customers?

Posted March 07, 2012

SAP application performance (speed and availability) is becoming a major focus as companies rely increasingly on their SAP systems to support employee productivity, partner collaboration, customer relationships, revenues, brand equity and growth. With many companies running their critical business processes on SAP, high availability and acceptable speed of the business software environment are essential requirements.

Posted February 23, 2012

In celebration of ODBC's 20th anniversary this year, Progress Software Corporation has unveiled its Platinum ODBC drivers, Progress DataDirect Connect for ODBC 7.0. The standards-based, fully interoperable Progress DataDirect Connect for ODBC 7.0 driver allows application developers to reliably exchange data between cloud data stores and other disparate data sources.

Posted February 23, 2012

Oracle has announced the availability of Oracle Advanced Analytics, a new option for Oracle Database 11g that combines Oracle R Enterprise with Oracle Data Mining. According to Oracle, Oracle R Enterprise delivers enterprise class performance for users of the R statistical programming language, increasing the scale of data that can be analyzed by orders of magnitude using Oracle Database 11g.

Posted February 23, 2012

The challenges of maintaining security and regulatory compliance as applications increasingly move to the cloud - whether public, private or hybrid - will come into greater focus in 2012, says Ryan Berg, cloud security strategy lead for IBM. The need to manage security among an increasingly mobile workforce, with many employees choosing to use their own personal devices, will also be a key concern in 2012, says Berg.

Posted February 23, 2012

Kalido, a provider of agile information management software, unveiled the latest release of the Kalido Information Engine, which helps organizations decrease the time for data mart migrations and consolidations. With this new release, customers will be able to import existing logical and physical models and taxonomies to build a more agile data warehouse. Enabling customers to take advantage of existing assets and investments "is going to dramatically reduce the time and the cost that it takes to bring together data marts into more of a data warehouse scenario," says John Evans, director of product marketing at Kalido.

Posted February 23, 2012

The Advantages of Using Structured Data for E-Discovery

Posted February 09, 2012

Today's organizations must capture, track, analyze and store more information than ever before - everything from mass quantities of transactional, online and mobile data, to growing amounts of "machine-generated data" such as call detail records, gaming data or sensor readings. And just as volumes are expanding into the tens of terabytes, and even the petabyte range and beyond, IT departments are facing increasing demands for real-time analytics. In this era of "big data," the challenges are as varied as the solutions available to address them. How can businesses store all their data? How can they mitigate the impact of data overload on application performance, speed and reliability? How can they manage and analyze large data sets both efficiently and cost effectively?

Posted February 09, 2012

Businesses are struggling to cope with and leverage an explosion of complex and connected data. This need is driving many companies to adopt scalable, high performance NoSQL databases - a new breed of database solutions - in order to expand and enhance their data management strategies. Traditional "relational" databases will not be able to keep pace with "big data" demands as they were not designed to manage the types of relationships that are so essential in today's applications.

Posted January 25, 2012

Tableau Software, a provider of business intelligence software, has announced the general availability of Tableau 7.0. The release offers improvements in performance and scalability, adds new visualization types and improves the product's overall analytical power and ease-of-use. In addition, the new Tableau Data Server capabilities will make it easy to share large volumes of data, share data models in large groups, and provide enhanced management of data assets, says Chris Stolte, chief development officer, co-founder, and inventor of Tableau Software.

Posted January 25, 2012

RainStor, a provider of big data management software, has unveiled RainStor Big Data Analytics on Hadoop, which the company describes as the first enterprise database running natively on Hadoop. It is intended to enable faster analytics on multi-structured data without the need to move data out of the Hadoop Distributed File System (HDFS) environment. There is architectural compatibility between the way RainStor manages data and the way HDFS manages CSV files, says Deirdre Mahon, vice president of marketing at RainStor.

Posted January 25, 2012

The Oracle Big Data Appliance, an engineered system of hardware and software that was first unveiled at Oracle OpenWorld in October, is now generally available. The new system incorporates Cloudera's Distribution Including Apache Hadoop (CDH3) with Cloudera Manager 3.7, plus an open source distribution of R. The Oracle Big Data Appliance represents "two industry leaders coming together to wrap their arms around all things big data," says Cloudera COO Kirk Dunn.

Posted January 25, 2012

"Big data" and analytics have become the rage within the executive suite. The promise is immense - harness all the available information within the enterprise, regardless of data model or source, and mine it for insights that can't be seen any other way. In short, senior managers become more effective at business planning, spotting emerging trends and opportunities and anticipating crises because they have the means to see both the metaphorical trees and the forest at the same time. However, big data technologies don't come without a cost.

Posted January 11, 2012


The argument that "everyone is doing it and you should too" holds no value for strategic decision-making in IT. Yet critical thinking often goes by the wayside when a hot, new trend catches on and it seems like the masses are following along. Cloud computing is certainly in vogue - most industry analysts are bullish on cloud computing adoption and anticipate enterprise spending to increase - but organizations need to steer clear of falling into the trap that moving to the cloud always delivers cost savings.

Posted January 11, 2012

December 2011 E-Edition UPDATE

Posted December 16, 2011

Stacks of statistics from many sources share a common theme - growth rates for digital information are extremely high and undeniable. A tsunami of e-information is fueling the engine of today's corporate enterprise, and many businesses are aiming to ride the information wave to prosperity. However, many companies are not sufficiently attentive to all the potential liabilities lurking in the depths of this digital information, including the risks involved in using real, live personal customer and employee data for application development and testing purposes. There's real potential for serious data security, legal and noncompliance risks when businesses fail to protect this data.

Posted December 01, 2011
