Trends and Applications



Until recently, companies were only warming up to the possibilities of cloud computing. Lately, however, for many enterprise-IT decision makers, cloud is hot, hot, hot. The sea change now underway means many companies are quickly moving from "dipping their toes into cloud computing" to a full-fledged immersion, says Thom VanHorn, vice president of marketing for Application Security, Inc. In 2012, expect to see those same companies dive right in. "The move will only accelerate," he tells DBTA.

Posted December 01, 2011

A new survey of 421 data managers and professionals who are members of the Independent Oracle Users Group (IOUG) finds that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. Many respondents report a significant surge of data within their data warehouses in recent times, fueled not only by growing volumes of transaction data but by unstructured data as well. Now, the challenge is to find ways to extend data analysis capabilities to additional business areas.

Posted December 01, 2011

Efficient Vehicle Tracking System Software Solution with Informix Dynamic Server

Posted November 10, 2011

Few things in the world are changing as dramatically as data. Data has tapped out a powerful rhythm to keep time with technology's bleeding edge, leaving many technologies struggling to keep up. It should come as no surprise, then, that many of the data strategies that IT departments developed, and still widely rely upon, are no longer sufficient for today's needs. You can put data marts near the top of that list.

Posted November 10, 2011

Customer centricity has become a watchword for major corporations worldwide, but a recently released survey has revealed that many enterprises are lacking in the basic knowledge of who their customers are, not to mention their attributes, tastes, purchasing histories and relationships with other customers.

Posted October 26, 2011

Columnar database technology burst onto the data warehouse scene just a couple of years ago with promises of faster query speeds on vast amounts of data. It delivered on that promise, but at a cost that is no longer worth paying. Here's why.
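
As a quick illustration of the underlying idea (a minimal sketch with made-up data, not any particular vendor's storage format): a column store lays each column out contiguously, so an analytic scan touches only the columns it needs instead of every field of every row.

    # Minimal sketch of row-oriented vs. column-oriented storage.
    # Table and values are illustrative only.

    rows = [
        {"order_id": 1, "region": "EU", "amount": 120.0},
        {"order_id": 2, "region": "US", "amount": 75.5},
        {"order_id": 3, "region": "EU", "amount": 99.5},
    ]

    # Columnar layout: one contiguous array per column.
    columns = {
        "order_id": [r["order_id"] for r in rows],
        "region":   [r["region"] for r in rows],
        "amount":   [r["amount"] for r in rows],
    }

    # An analytic query such as SUM(amount) scans a single column,
    # rather than every field of every row.
    total = sum(columns["amount"])
    print(total)  # 295.0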

Posted October 26, 2011

Ambitious Plans for 2012 Laid Out by User Group Presidents at OpenWorld

Posted October 26, 2011

Teradata, a data analytics solutions provider, announced the latest update to its flagship data warehouse product, as well as new features in its data warehouse appliance. Teradata Database 14 is designed as the analytical engine powering all of the vendor's "purpose-built" platform family members - from enterprise data warehouses to appliances. "We're including big data application support," says Scott Gnau, president of Teradata Labs, at a briefing at the vendor's recent user group event.

Posted October 26, 2011

DataStax, a provider of solutions based on the open source Apache Cassandra database platform, announced it is shipping an enterprise database platform designed to enable the management of both real-time and analytic workloads from a single environment. The new platform, DataStax Enterprise, is designed to leverage the features of Cassandra to provide performance enhancements and cost savings over traditional database management solutions, the vendor claims.

Posted October 26, 2011

Legacy IT systems were developed for - and still run - about 60% of all mission-critical applications. They remain stable and reliable. However, years of maintenance have made legacy applications so complex that, rather than helping to solve business problems, most have become a challenge that enterprises are grappling with.

Posted October 15, 2011

Cloud computing has taken the enterprise IT world by force. IT managers and CIOs are evaluating private, public and hybrid cloud infrastructures for running corporate applications and services. Many are doing pilots and evaluating large-scale migrations to the cloud, with the hope of not only saving money but increasing services for users.

Posted October 15, 2011

Joe Clabby authors a comprehensive overview on the SHARE website this week focusing on the issue of the IT Skills Gap. While the need for mainframe professionals remains high, for instance, the supply of young mainframers remains stubbornly short. Certainly, the supply of mainframers coming out of universities is hamstrung by a number of specific factors including an uninformed perception by students about the platform's ongoing importance, a dearth of curriculum and faculty covering the subject in many computer science programs, and a lack of outreach by industry into local computer science departments to foster both academic and internship focus.

Posted October 15, 2011

During a keynote presentation last week at Oracle OpenWorld 2011, Thomas Kurian, executive vice president, Product Development, Oracle, announced the new Oracle Big Data Appliance, an engineered system optimized for acquiring, organizing and loading unstructured data into Oracle Database 11g.

Posted October 15, 2011

Smartphones, tablets and other handhelds are changing the way companies do business. And when these revolutionary devices can be combined with existing tried-and-true software for evolutionary change, as opposed to ripping and replacing, the results are even better.

Posted September 14, 2011

As companies learn to embrace "big data" - terabytes and gigabytes of bits and bytes, strung across constellations of databases - they face a new challenge: making the data valuable to the business. To accomplish this, data needs to be brought together to give decision makers a more accurate view of the business. "Data is dirty and it's hard work; it requires real skills to understand data semantics and the different types of approaches required for different data problems," Lawrence Fitzpatrick, president of Computech, Inc., tells DBTA. "It's too easy to see data as 'one thing.'"

Posted September 14, 2011

As data grows, the reflex reaction within many organizations is to buy and install more disk storage. Smart approaches are on the horizon but still prevalent among only a minority of companies. How has data grown so far, so fast? Technology growth along the lines of Moore's Law (doubling every 18 months) has made petabyte-capable hardware and software a reality, and data growth itself appears to be keeping pace with the hardware and systems. In fact, a petabyte's worth of data is almost commonplace, as shown in a new survey conducted by Unisphere Research among members of the Independent Oracle Users Group (IOUG). In "The Petabyte Challenge: 2011 IOUG Database Growth Survey," close to 1 out of 10 respondents report that the total amount of online (disk-resident) data they manage today - taking into account all clones, snapshots, replicas and backups - now tops a petabyte.
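
As a rough back-of-the-envelope check on that doubling rate (the 100 TB starting point is hypothetical, not a figure from the survey):

    # Growth under the article's "doubling every 18 months":
    # capacity(t) = capacity(0) * 2 ** (months / 18)

    start_tb = 100  # hypothetical starting point: 100 TB under management
    for years in (1, 3, 5, 10):
        months = years * 12
        print(years, "years:", round(start_tb * 2 ** (months / 18)), "TB")

    # Ten years at that rate multiplies storage roughly 100-fold,
    # which is how terabyte shops become petabyte shops.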

Posted September 14, 2011

Virtualization is a broad term and a hot topic among IT professionals. However, even if your organization has conquered server virtualization, or is confidently well underway, applying those same practices to desktop virtualization will set you up for failure.

Posted August 29, 2011

Getting Out of Storage Debt

Posted August 29, 2011

Informatica Corporation has announced the availability of what the company describes as the industry's first dynamic data masking (DDM) solution. Informatica Dynamic Data Masking provides real-time, policy-driven obfuscation of sensitive data to address a wide range of common data security and privacy challenges without requiring any changes to database or application source code. It is intended to address problems that cannot be solved by other technologies such as IAM (identity and access management) and SDM (static data masking). Informatica Dynamic Data Masking is based on technology developed by ActiveBase, which was acquired by Informatica in July 2011.
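
The general idea behind dynamic masking can be sketched in a few lines. This is a hypothetical illustration of the technique - a proxy rewriting result rows on the fly, driven by a policy - not Informatica's implementation:

    # Minimal sketch of dynamic data masking: rewrite result rows in flight,
    # driven by a policy, so the application and database stay unchanged.
    # Policy and data are illustrative only.

    MASKING_POLICY = {
        "ssn":   lambda v: "***-**-" + v[-4:],            # keep last four digits
        "email": lambda v: v[0] + "****@" + v.split("@")[1],
    }

    def mask_row(row: dict) -> dict:
        """Apply the policy to one result row; unlisted columns pass through."""
        return {col: MASKING_POLICY.get(col, lambda v: v)(val)
                for col, val in row.items()}

    row = {"name": "Ada", "ssn": "123-45-6789", "email": "ada@example.com"}
    print(mask_row(row))
    # {'name': 'Ada', 'ssn': '***-**-6789', 'email': 'a****@example.com'}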

Posted August 29, 2011

Endeca Technologies, Inc., an information management software company, has unveiled native integration of Endeca Latitude with Apache Hadoop. "This native integration between Endeca Latitude and Hadoop brings together big data processing from Hadoop and interactive search, exploration and analysis from Latitude," says Paul Sonderegger, chief strategist of Endeca Technologies.

Posted August 29, 2011

Another IT initiative is in the news. What does it really mean for you? Is it an opportunity? Or is it a distraction? Whatever your perspective, it seems clear that internet computing standards have reached another plateau of standardization and capability, such that vendors see an opportunity to pursue new models of computing.

Posted August 11, 2011

The rise of big data has garnered much of the attention in the data management arena lately. But it is not simply the sheer volume of data that is challenging data professionals. Many new types and brands of DBMSs are also popping up across organizations, bringing new problems for the data professionals who are tasked with managing them, and also giving rise to scores of "accidental database administrators" with no formal DBA training, a new Unisphere Research study reveals.

Posted August 11, 2011

Over the years, countless papers and articles have been written on enterprise resource planning (ERP) implementation project success rates and why ERP projects fail. Interestingly enough, the reasons that projects fail are the same today as they were 10 years ago: lack of top management commitment, unrealistic expectations, poor requirements definition, improper package selection, gaps between software and business requirements, inadequate resources, underestimating time and cost, poor project management, lack of methodology, underestimating impact of change, lack of training and education, and last, but not least, poor communication.

Posted August 11, 2011

There is no doubt that virtualization is radically changing the shape of IT infrastructure, transforming the way applications are deployed and services delivered. Databases are among the last of the tier 1 applications to be hosted on virtual servers, but the past year has seen a huge increase in production Oracle, SQL Server and other databases running on VMware platforms. For all the benefits of virtualization, including cost-effectiveness, there are some impacts on the IT staff involved. Unfortunately for DBAs, virtualization often means losing control and visibility of their systems, which can ultimately hinder their ability to deliver database-oriented business solutions. While in the past DBAs had full visibility into the physical servers hosting their databases, the virtualization layers and the tools to manage them are typically off-limits to them. So while all the excitement of late has centered on VMware and other virtual machine systems, DBAs have a valid reason for skepticism.

Posted July 27, 2011

Given all of the recent discussion around big data, NoSQL and NewSQL, this is a good opportunity to visit a topic I believe will be (or should be) at the forefront of our minds for the next several years - high velocity transactional systems. Let's start with a description of the problem. High velocity transactional applications have input streams that can reach millions of database operations per second under load. To complicate the problem, many of these systems simply cannot tolerate data inconsistencies.
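
A toy example of the consistency half of the problem (an illustrative sketch, not any vendor's engine): even a single counter can lose updates once concurrent writers interleave, which is why these systems must serialize or otherwise coordinate writes without sacrificing throughput.

    # A bare read-modify-write counter can drop updates under concurrency.
    import threading

    balance = 0
    lock = threading.Lock()

    def unsafe_credit(n):
        global balance
        for _ in range(n):
            balance += 1      # load, add, store: interleavings can drop updates

    def safe_credit(n):
        global balance
        for _ in range(n):
            with lock:        # serialize the critical section, as a
                balance += 1  # transactional engine must guarantee

    threads = [threading.Thread(target=unsafe_credit, args=(100_000,))
               for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(balance)  # may fall short of 400000 - exactly the inconsistency at issue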

Posted July 27, 2011

Oracle has introduced the Oracle Exadata Storage Expansion Rack to offer customers a cost-effective way to add storage to an Oracle Exadata Database Machine. "There are customers, earlier Exadata Database Machine customers, that have now started to fill up the disks that they have on the Database Machine and they are starting to look for ways to expand their storage capacity, and so this is going to be really welcome for them," says Tim Shetler, vice president of Product Management, Oracle.

Posted July 27, 2011

Sybase, an SAP company, has announced the general availability of Sybase IQ 15.3, which aims to help enterprise IT departments overcome the scalability limitations of many data warehouse approaches. By implementing a business analytics information platform that allows sharing of computing and data resources through the new Sybase IQ PlexQ technology, the company says enterprises can break down user and information silos to increase analytics adoption throughout their entire organization. There is a lot of talk about big data, but how to manage it and analyze it is only half the problem, observes David Jonker, senior product marketing manager of Sybase IQ. "The other half is how do you make it more pervasive throughout the enterprise and from our perspective that is where a lot of the existing data warehousing solutions fall down."

Posted July 27, 2011

The big data playing field grew larger with the formation of Hortonworks and HPCC Systems. Hortonworks is a new company consisting of key architects and core contributors to the Apache Hadoop technology pioneered by Yahoo. In addition, HPCC Systems, which has been launched by LexisNexis Risk Solutions, aims to offer a high performance computing cluster technology as an alternative to Hadoop.

Posted July 27, 2011

Time series data is a sequence of data points, typically measured at successive times and often spaced at uniform intervals. Time-stamped data can be analyzed to extract meaningful statistics or other characteristics, and can also be used to forecast future events based on known past events. Time series data enables applications such as economic forecasting, census analysis and forecasting, fleet management, stock market analysis, and smart energy metering. Because it is time-stamped, time series data has a special internal structure that differs from relational data. Additionally, many applications, such as smart metering, store data at frequent intervals, requiring massive storage capacity. For these reasons, it is not sufficient to manage time series information using the traditional relational approach of storing one row for each time series entry.
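
A rough sketch of the storage argument (illustrative schema and numbers, assuming one reading every 15 minutes from a million meters):

    # Traditional relational layout: one (meter_id, timestamp, value) row per
    # reading. A meter read every 15 minutes emits 96 rows/day, and a million
    # meters emit ~35 billion rows/year, each repeating the meter_id and
    # carrying per-row overhead.
    rows_per_year = 1_000_000 * 96 * 365
    print(f"{rows_per_year:,} rows/year")  # 35,040,000,000

    # Time-series layout: one logical row per meter holding an ordered,
    # regularly spaced array of values. The timestamps become implicit
    # (start time + interval), so per-reading overhead largely disappears.
    series = {
        "meter_id": 42,
        "start": "2011-07-01T00:00",
        "interval_minutes": 15,
        "values": [3.2, 3.1, 3.4, 3.3],  # one slot per interval
    }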

Posted July 07, 2011

Today, we operate in a global economy at internet speed. Globalization of our workforce has shifted the way work gets done. The explosion of wireless and edge technology has raised the expectations of consumers, who are more informed, educated, and knowledgeable about products and services. This changing landscape places immense pressure on business applications in organizations worldwide. Critical application outages caused by software defects can cost the business millions of dollars in revenue for every hour of downtime.

Posted July 07, 2011

As the economy shifts to expansion mode, and businesses start hiring again, a familiar challenge is rearing its head. Companies are scrambling to find the talent needed to effectively run, maintain, and expand their technology platforms. This is not a new problem by any means, but this time around, it is taking on a greater urgency, as just about every organization relies on information technology to be competitive and responsive to growth opportunities. A new survey of 376 employers finds a majority depend on the educational sector - universities and colleges - to provide key IT skills, often in conjunction with their own internal training efforts. However, few of the executives and managers hiring out of colleges are entirely satisfied with the readiness of graduates.

Posted July 07, 2011

Representing a continued expansion of its big data analytics portfolio, IBM has introduced a new addition to the Netezza product family of analytics appliances that is designed to help organizations uncover patterns and trends from extremely large data sets. The appliance is the first to be delivered by IBM since it acquired Netezza in November 2010. According to IBM, using the new appliance, businesses can now more easily sift through petabytes of data, including banking and mobile phone transactions, insurance claims, electronic medical records, and sales information, and they can also analyze this information to reveal new trends on consumer sentiment, product safety, and sales and marketing effectiveness. "This new appliance takes the scalability to a completely new dimension," says Razi Raziuddin, senior director of product management at IBM Netezza.

Posted June 24, 2011

BMC Software has acquired the portfolio of IMS (Information Management System) database products and customers from NEON Enterprise Software, a mainframe management software company. According to BMC, adding NEON Enterprise Software's IMS products to its existing offerings will satisfy the critical need organizations have for industry-leading, high-performance solutions that not only help manage, optimize and support IMS environments, but also reduce operating costs and improve business service delivery. All told, BMC is acquiring more than 20 products through the acquisition, says Robin Reddick, director of MSM Solutions Marketing at BMC Software.

Posted June 24, 2011

EMC Corporation, a provider of storage and infrastructure solutions, announced it will be shipping a data warehouse appliance that leverages the Apache Hadoop open-source software used for data-intensive distributed applications. The company's high-performance, data co-processing Hadoop appliance - the Greenplum HD Data Computing Appliance - integrates Hadoop with the EMC Greenplum Database, allowing the co-processing of both structured and unstructured data within a single solution. EMC also says the solution will run either Hadoop-based EMC Greenplum HD Community Edition or EMC Greenplum HD Enterprise Edition software.

Posted June 24, 2011

Oracle has announced the availability of Oracle JDeveloper 11g Release 2. Part of Oracle Fusion Middleware 11g, Oracle JDeveloper is a free, full-featured IDE. "It's a very broad and very productive environment targeted toward Oracle developers in the Java environment," Bill Pataky, vice president of product management, Oracle, tells 5 Minute Briefing. The new release enhances the overall development experience by delivering an improved IDE, including support for new technologies and standards, as well as updates to Oracle Application Development Framework (ADF) 11g. "Java developers will immediately notice the difference."

Posted June 24, 2011

Composite Software has introduced Composite 6, a new version of its flagship data virtualization software that provides "big data" integration support for the Cloudera Distribution including Apache Hadoop (CDH), IBM Netezza and HP Vertica data sources. In addition, Composite 6, which is now completing beta testing and will be commercially available in July, includes performance optimizations, cache enhancements, new data governance capabilities and ease-of-use features. "Data virtualization is emerging as an ideal solution for managing today's complex data integration challenges," says Jim Green, CEO of Composite Software.

Posted June 24, 2011

HP has unveiled a new suite of software, the HP IT Performance Suite, which it says is designed to rationalize, measure and improve IT performance. The suite provides CIOs insight from across a comprehensive range of solutions to manage and optimize application development, infrastructure and operations management, security, information management, and financial planning and administration. Each product in the HP Software portfolio improves the performance of the discrete IT functions addressed, while a new IT Executive Scorecard helps technology executives optimize overall IT investments and outcomes.

Posted June 24, 2011

When you think of mission-critical services, perhaps none is as critical as electrical service. Not much can happen in modern businesses, government offices, or even homes without it. Central Vermont Public Service (CVPS) is the largest electric company in Vermont. More than 159,000 customers in 163 communities rely on the electrical service CVPS provides. And, according to J.D. Power and Associates, a global marketing and surveying company, for overall customer satisfaction, CVPS continues to rank in the top tier of utilities in the eastern region, more than 50 points above the regional average.

Posted June 08, 2011

Is the day of reckoning for big data upon us? To many observers, the growth in data is nothing short of incomprehensible. Data is streaming into, out of, and through enterprises from a dizzying array of sources - transactions, remote devices, partner sites, websites, and nonstop user-generated content. Not only are the data stores resulting from this information driving databases to scale into the terabyte and petabyte range, but they occur in an unfathomable range of formats as well, from traditional structured, relational data to message documents, graphics, videos, and audio files.

Posted June 08, 2011

Jaded IT professionals and managers, as well as market analysts, weary and wary from decades of overblown analyst claims about emerging new technologies, "paradigm shifts" and "enchanted quadrants," will take heart in a new series of Unisphere Research studies being released over the next several months. The first of these, "The Post-Relational Reality Sets In: 2011 Survey on Unstructured Data," has just been released, and tackles the current dimensions and impact of unstructured data on enterprise IT practices, technologies, policies, purchasing priorities and the evaluation of new technologies.

Posted June 08, 2011

The service management world of today is all about linking business services to the underlying IT infrastructure, creating an effective bridge between the business and technology. In theory, this provides a clear window into the IT environment to increase accountability, productivity and efficiency. Effective service management also provides business context, so IT can take action to avert service-impacting events by understanding business priority. However, current business service management (BSM) does not provide enough guidance about how to manage services proactively and effectively. This issue is now more important than ever, because on the horizon lurks an exciting new arena for service management - virtualization and cloud computing.

Posted May 25, 2011

The Oracle Exadata Database Machine X2-2 and X2-8 with the Solaris option began shipping just this month. Now in its third generation, the Database Machine combines all the components to create what the company describes as the best platform for running the Oracle Database. Here, Tim Shetler, vice president of Product Management, Oracle, talks about the performance innovations that differentiate Oracle's offering, how customers are using the system today for business advantage, and what's ahead.

Posted May 25, 2011

SnapLogic, a provider of application integration software, has introduced a solution aimed at enabling easy connection and reliable large-scale data integration between business applications, cloud services, social media and Hadoop. The product, called SnapReduce, transforms SnapLogic data integration pipelines directly into MapReduce tasks, making Hadoop processing more accessible and resulting in optimal Hadoop cluster utilization. "This is Hadoop for humans," says Gaurav Dhillon, CEO of SnapLogic.
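
For readers unfamiliar with the target model, a MapReduce task boils down to three steps: a map step emits (key, value) pairs, a shuffle groups them by key, and a reduce step aggregates each group. A minimal plain-Python sketch of the pattern itself, not of SnapLogic's generated output:

    from collections import defaultdict

    records = ["error timeout", "ok", "error disk", "ok", "ok"]

    # Map: emit one (word, 1) pair per word
    mapped = [(word, 1) for line in records for word in line.split()]

    # Shuffle: group values by key
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)

    # Reduce: aggregate each group
    counts = {key: sum(values) for key, values in groups.items()}
    print(counts)  # {'error': 2, 'timeout': 1, 'ok': 3, 'disk': 1}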

Posted May 25, 2011

SAP AG and Sybase, Inc., an SAP company, have announced plans to make the enterprise resource planning (ERP) application SAP ERP the first SAP Business Suite application running on Sybase Adaptive Server Enterprise (ASE). The announcement was made at SAPPHIRE NOW in Orlando where pilot customers also showcased how they are using SAP ERP on Sybase ASE. In combining SAP applications with Sybase technology, along with "harmonized" customer services and support, the companies say they will be able to offer organizations a new database option for running SAP applications and accessing critical information, providing efficiency gains and cost reductions.

Posted May 25, 2011

Informatica Corporation has announced Informatica Cloud Summer 2011, a major new release of its cloud integration service. The Informatica Cloud Summer 2011 release enables universal cloud integration and unified hybrid deployment for both on-premise and cloud deployments. The new release provides ease-of-use features that simplify learning, deploying, administering, managing and configuring cloud integration, as well as enterprise-class functionality, including fine-grained access controls and delegated administration.

Posted May 25, 2011

Pervasive Software Inc., a provider of data integration solutions, has launched an online marketplace and community that is intended to fill a market void by providing simplification, ease of access and a public marketplace for data integration products, solutions, connectors, plug-ins and templates. Pervasive Galaxy is intended to serve as a community platform for data integration ecosystems to enable simple, profitable convergence between business-to-business integration producers and consumers through faster market and social connection. "It's Bazaar Voice, iTunes, and App Store, all rolled into one mashup," Ron Halversen, director of integration marketing, tells 5 Minute Briefing. "Galaxy is an app exchange for connectors."

Posted May 25, 2011

Big data provides new opportunities to improve customer care, unearth business insights, control operational costs, and in some cases, enable entirely new business models. With access to larger and broader data sets, you can improve forecasts and projections for the business. A healthcare organization can conduct longitudinal analysis against years of data for patients treated for coronary attacks in order to improve care and speed time to recovery. A retailer can conduct deeper analysis of buying behavior during recessionary times if it has access to large data sets collected during the last economic downturn. Additionally, organizations across many sectors, such as communications, financial services and utilities, face significant regulatory and legal requirements for retaining and providing fast access to historical data for inquiries, audits and reporting.

Posted May 12, 2011

North American businesses are collectively losing $26.5 billion in revenue each year as a result of slow recovery from IT system downtime, according to a recent study. To protect against unexpected outages, IT organizations attempt to prepare by creating redundant backup systems, duplicating every layer in their existing infrastructure and preparing elaborate disaster recovery processes. This approach is expensive and only partly effective, as demonstrated by the string of notable outages, and can be seen, at best, as a way to minimize downtime. Major web-scale companies, such as Google and Facebook, have figured out how to scale out their application stacks rather than scale up vertically.

Posted May 12, 2011

Through the many changes in IT over the years, one constant has been a concern for performance. With database systems, this is especially true. Even with the many advances in relational database technology, SQL performance remains a key concern for IT professionals and management. Writing SQL for performance is one of the biggest opportunities for professionals to contribute efficient, effective, cost-saving deliverables to projects. It can also help avoid having to respond to an urgent performance problem in a production environment. To a considerable extent, a person can choose whether they are running because it is lunch or whether they are running because they are lunch, by following a few simple techniques for writing SQL for performance.
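
One classic example of such a technique is keeping predicates sargable: wrapping an indexed column in a function hides it from the index, while an equivalent range predicate on the bare column lets the index do the work. A small runnable sketch, using SQLite purely for illustration:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id INTEGER, order_date TEXT)")
    db.execute("CREATE INDEX idx_date ON orders(order_date)")

    # Non-sargable: the function hides order_date from the index
    plan = db.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders "
        "WHERE substr(order_date, 1, 4) = '2011'").fetchall()
    print(plan[0][3])  # e.g. 'SCAN orders' (full table scan)

    # Sargable rewrite: a range on the bare column can use the index
    plan = db.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders "
        "WHERE order_date >= '2011-01-01' AND order_date < '2012-01-01'").fetchall()
    print(plan[0][3])  # e.g. 'SEARCH orders USING INDEX idx_date ...'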

Posted May 12, 2011

River Parishes Community College (RPCC) is an open-admission, 2-year, public institution. It is located in the small Ascension Parish town of Sorrento in what is known as the River Parishes region of the state because of the parishes' proximity to the Mississippi River. RPCC recently implemented a new self-service student portal based on Revelation Software's Web 2.0 toolkit, OpenInsight for Web (O4W). The new portal allows students to accomplish a range of tasks on their own, such as scheduling classes, without requiring assistance from school administrators.

Posted May 12, 2011
