The rising popularity of Web 2.0 and cloud computing services has prompted a reexamination of the infrastructure that supports them. More and more people are using web-based communities, hosted services, and applications such as social-networking sites, video-sharing sites, wikis, and blogs. And the number of businesses adopting cloud computing applications such as software as a service and hosted services is climbing swiftly. With all this growth, internet data centers are struggling to handle unprecedented workloads, spiraling power costs, and the limitations of the legacy architectures that support these services. The industry has responded by moving toward a "data center 2.0" model in which new approaches to data management, scaling, and power consumption enable data center infrastructures to support this growth.
Posted October 13, 2009
In art, the expression "less is more" implies that simplicity of line and composition allows a viewer to better appreciate the individual elements of a piece and their relationship to each other as a whole. In engineering, "less is more" when you accomplish the same work with fewer moving parts. And when dining out, "less is more" when the portions may be smaller, but the food is so much better and more satisfying. In IT, the adage is more accurately stated today as "less does more." As IT increases in complexity, mainframe organizations are being asked to handle greater workloads, bigger databases, more applications, more system resources, and new initiatives. All this, without adding—and sometimes while cutting—staff. In addition, IT is undergoing a serious "mainframe brain drain," as the most experienced technicians retire, taking with them their skills and detailed knowledge of the mainframes' idiosyncrasies.
Posted October 13, 2009
Why do business decision makers need to wait for IT to deliver performance reports on the business? Why can't they build their own reports and gain rapid access to the answers they need?
Posted September 14, 2009
A member of the Quest International Users Group and IT specialist at Shell Canada Ltd., Sue Shaw took on the role of president of the users group in June. She talks with Database Trends and Applications about what drew her in as a member and her goals for the PeopleSoft, JD Edwards, and Oracle Utilities association now that she is at the helm.
Posted September 14, 2009
This year, despite a turbulent economy marked by painful layoffs in many sectors, database professionals appear to be weathering the storm. In fact, database professionals reported higher incomes and bonuses this year over last. Still, a sizeable segment of professionals saw changes in their jobs as a result of economic conditions, and many are concerned going forward about the impact of tighter budgets on their departments' performance.
Posted September 14, 2009
The Swiss National Sound Archives is Switzerland's official depository of audio records. Founded by law in 1987 as a private foundation working in close collaboration with the Swiss National Library in Bern, the Swiss National Sound Archives has as its mission the preservation of the country's audio heritage. Strictly for Switzerland's audio archives, the foundation collects and safeguards anything sound-related, including speeches, theatrical works, interviews, audio books, and all types of music, from rock to classical. It makes these recordings and detailed information about them, such as the people involved in their creation, available through a website accessible to the public in Switzerland's four official languages (German, French, Italian, and Romansh) as well as in English.
Posted September 14, 2009
Microsoft SQL Server has been a favorite for years for organizations that want to implement business intelligence (BI) functionality - even in traditionally non-Microsoft shops. Especially since the SQL Server 2005 release, the ROI of a Microsoft solution coupled with the ease of implementation has driven healthy adoption of the DBMS for BI. And the integration of SQL Server 2008 with Microsoft Office, SharePoint Server, and PerformancePoint Services for delivering BI to end users has created an even stronger end-to-end platform.
Posted September 14, 2009
The concept of database sharding has gained popularity over the past several years due to the enormous growth in transaction volume and size of business-application databases. Database sharding can be simply defined as a "shared-nothing" partitioning scheme for large databases across a number of servers, enabling new levels of database performance and scalability. If you think of broken glass, you can get the concept of sharding—breaking your database down into smaller chunks called "shards" and spreading them across a number of distributed servers.
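The hash-partitioning idea behind sharding can be sketched in a few lines. Note that the shard count, the key-to-shard hash, and the in-memory dicts standing in for independent "shared-nothing" database servers are all illustrative assumptions, not any particular product's design:

```python
import hashlib

NUM_SHARDS = 4
# Each dict stands in for one independent database server ("shard").
shards = [{} for _ in range(NUM_SHARDS)]

def shard_for(key: str) -> int:
    """Map a record key deterministically to one shard index."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

def put(key: str, row: dict) -> None:
    """Write a row to whichever shard owns its key."""
    shards[shard_for(key)][key] = row

def get(key: str):
    """Read a row by routing the lookup to the owning shard."""
    return shards[shard_for(key)].get(key)
```

Because the routing function is deterministic, every read for a given key lands on the same server that received the write, and no shard needs any knowledge of the others, which is what lets capacity scale by adding servers.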
Posted August 14, 2009
Insiders, by virtue of their easy access to organizations' information, systems, and networks, pose a significant risk to employers. Every day, there's a new shocking headline concerning a major network security breach caused (knowingly or unknowingly) by a corporate insider. And the number of security breaches that start from within keeps growing—particularly in this down economy, as the number of disgruntled employees escalates. You'd think that large organizations in particular would be rushing to protect themselves from such headlines and liability, but they just aren't getting the message. Nor are they taking the necessary steps to protect themselves from a policy and technical standpoint.
Posted August 14, 2009
There are four main kinds of information management professionals in an OLTP environment—data architects, database architects, application DBAs, and operational (or production support) DBAs. An information management professional should aim to master all four roles and shift among them with ease. Once you can shift roles easily, be assured that you're adding to the revenue of your business.
Posted August 14, 2009
The data manager's job has never been easy, often presenting significant challenges, including data system rewrites, data security, regulatory compliance, and reporting. And the digital age, with a myriad of new and innovative data sources and more sophisticated analytic models, presents its own unique hurdles to implementing a successful data-management and data-quality program in the modern insurance enterprise.
Posted August 14, 2009
Data encryption serves two purposes: it protects data against internal prying eyes, and it protects data against external threats (hacking, theft of backup tapes, etc.). Encryption in the database tier offers the advantage of database-caliber performance and protection of encryption keys without incurring the overhead and additional cost of using a third-party encryption tool in an application tier.
Posted July 13, 2009
As organizations grow and evolve, they must implement technology changes to accommodate evolving infrastructure needs, often within complex systems running business-critical applications. Along with this, there frequently is an increased demand to reduce the costs of technology by sharing hardware and software resources, a demand many companies try to meet by establishing virtual environments. Virtualization balances often underutilized physical machines by consolidating them onto a single physical host that runs multiple virtual machines (VMs) sharing its four core resources—CPU, memory, disks, and network cards.
Posted July 13, 2009
Using historically standard analysis techniques related to file placement on disks within the Unisys 2200 environment, it is possible to significantly improve performance and capacity without significant additional outlays for hardware. Disk usage and file placement are now commonly dismissed as no longer relevant, on the "understanding" that modern disks provide sufficient native speed that file placement no longer matters. This is not a valid assumption.
Posted July 13, 2009
In today's competitive and crisis-ridden market, companies are under pressure to rapidly deliver results and make necessary changes—which requires that decision makers have accurate and timely information readily available. However, many executives have doubts about the timeliness of the information they now receive through their current BI and analytics systems.
Posted June 15, 2009
Compared to the myriad "integrated" systems that most companies are managing today, master data management (MDM) solutions are much simpler to manage and maintain, and provide companies with more business benefits. Unfortunately, MDM technology is developing a reputation for being complicated and taking a long time to implement, when the reality is that the process can be dramatically simplified if companies plan before they implement.
Posted June 15, 2009
Enormous data volumes in complex systems that exact a high total cost of ownership (TCO) are endemic in today's enterprises. Must this always be the case? Not for enterprises and agencies using today's advanced data virtualization to simplify data complexity and reduce costs, time to solution, and risk.
Posted June 15, 2009
This is a time of great change for data centers. Technology is advancing and getting smarter, and workloads and performance demands keep growing. For this issue of Database Trends and Applications, we sought a range of industry views on the most profound—and perhaps unexpected—changes reshaping data centers and enterprise IT.
Posted June 15, 2009
Sybase turned in the best year in its history in 2008, followed by its best-ever first quarter in 2009. Brian Vink chats with Database Trends and Applications about what he sees as the key issues in information management, the company's partnership with SAP and the plans to revamp TechWave this year.
Posted June 15, 2009
In this challenging economy, many IT organizations are putting even greater focus on identifying how to best meet business objectives with fewer people and reduced IT budgets. They are discovering how the mainframe can help them deal with these challenges.
Posted May 15, 2009
Setting up a replication configuration is a fairly standard way to enable disaster recovery (DR) for business-critical databases. In such a configuration, changes from a production or primary system are propagated to a standby or secondary system. One of the important technology decisions that organizations make upfront is the choice of the replication architecture.
Posted May 15, 2009
IT GRC—or, IT governance, risk and compliance—is rapidly gaining the attention of CIOs and CISOs in businesses across the country. After all, the objective of IT GRC is to more efficiently strike an appropriate balance between business reward and business risk, a balance these executives must get right. How does IT GRC help? By replacing traditional, siloed approaches to addressing individual components with a more unified approach that takes advantage of the many commonalities and interrelationships that exist among governance, compliance and risk management.
Posted May 15, 2009
IT managers from organizations of all sizes know the importance of maintaining access to critical applications and data. From irritating "system unavailable" messages to the most unfortunate natural and manmade disasters where entire systems may be lost, the challenge is particularly acute for database-driven, transactional applications and data—the lifeblood of the business. The dynamic, transactional data and applications that comprise, process, manage and leverage critical customer accounts and history, sales, marketing, engineering and operational components keep the organization thriving.
Posted April 15, 2009
Ed Boyajian joined EnterpriseDB, the open source database company whose products and services are based on PostgreSQL, in June 2008 as president and CEO. Before that, he spent six years in sales leadership roles at Red Hat, including vice president and general manager for North American sales, and vice president, worldwide OEM and North American channels. Recently, Boyajian chatted with DBTA about the looming challenges and opportunities for open source in general as well as for EnterpriseDB's Postgres Plus product family.
Posted April 15, 2009
Those of us in the data security industry, practitioners and vendors alike, have been conditioned to think of data protection in terms that are analogous to physical security. Blocking devices and sensors are akin to locks and security systems. This is why for years we have been investing in those technologies that will block out unauthorized connections, all the while making information more and more accessible. There is, however, a new world order at hand. Data creation rates now far outpace the ability of IT managers to write security rules, and data breaches and threats originating from network insiders have proven much more frequent and insidious than even our most dire predictions of five years ago.
Posted April 15, 2009
Every data integration initiative—whether it supports better decision making, a merger/acquisition, regulatory compliance, or other business need—requires a set of processes to be completed before the data can be made available to business users. Though this set of processes is fairly well understood by industry practitioners, there are still many areas left unaddressed and, therefore, the process is time-consuming, inefficient, unpredictable, and costly.
Posted April 15, 2009
Business intelligence (BI) and analytics solutions have been available for years now, and companies have learned to employ these tools for a variety of purposes, from simple report generation and delivery to more sophisticated data integration, executive dashboards, and data mining. They also recognize the need to get beyond spreadsheets, and to be able to provide more sophisticated, pervasive, and automated BI solutions to more end-user decision makers. However, most see their efforts stymied by the historically high cost of BI software and the complexity of available solutions.
Posted March 15, 2009
The time is past when the unique attributes of the MultiValue database model alone provided sufficient justification for the use of the technology, according to Pete Loveless. He explains why MV companies must support interoperability and integration from the ground up, in order to meet the challenges presented by the market now, and in the future.
Posted March 15, 2009
Over the past year, we have seen a number of new entrants in the data warehouse appliance market. What user requirements are driving the launch of these new appliance solutions and are appliances a niche solution, or is this the beginning of a broader-based trend?
Posted March 15, 2009
Many IT and business managers are now familiar with the concept of virtualization, especially as it pertains to running a secondary operating system on hardware that already supports a different OS. Seasoned data center professionals have been aware of virtualization as a capability available on mainframes for years. The ability of virtualization to provide advantages to data center operations in terms of systems consolidation and simplifying administration has been well-documented.
Posted March 15, 2009
Decision-making is no longer restricted to the confines of the office. The need for critical financial metrics for an off-site board meeting, the latest market share reports for a client visit, or timely sales data for a supplier meeting are all examples that highlight the need for anytime, anywhere access to insightful information. If mobile technology is allowing users to check email, download ringtones, play games, manage schedules, and plan tasks, then why should work-related information be left behind? It is not. Mobile business intelligence (MBI), a convergence of business intelligence software, mobile technology, and Internet connectivity, is ensuring that information travels with the mobile workforce.
Posted February 15, 2009
Alvion Technologies provides a web-enabled platform that allows compilers, resellers and managers of marketing lists to easily deliver their product to end users in support of targeted marketing efforts. Individual customers submit their data, and then Alvion runs customer-specific data transformation and uploads the data to production servers, for access by end users who are the customers of the data owners. If you need, for example, to find consumers within a 35-mile radius of your business who meet a certain profile, you can go online and find lists within Alvion, put in the criteria you are looking for, and those names will be provided to you via electronic delivery, be it email or download.
Posted February 15, 2009
Virtualization transforms individual physical machines into multiple virtual machines (VMs), which can be cloned instantly at no perceived cost and moved seamlessly from one physical machine to another. While the power of virtualization is enticing, its management implications are daunting. A completely new management protocol is needed to match the dynamic nature of virtual environments and keep pace with their evolution as they move beyond the enterprise and into the cloud.
Posted February 15, 2009
A leading supplier of data integration software for businesses, finding that its developers were spending too much time grappling with data management inside each of its products, adapted its architecture to a service-oriented architecture (SOA) and built its own data services platform (DSP). However, problems arose that required a complete rebuilding of the architecture. The underlying cause of those problems? Poorly architected data access.
Posted February 15, 2009
Complex Event Processing is only a few years old, but it is rapidly entering the mainstream in a large number of fields that require continuous analysis of large volumes of real-time data.
Posted January 15, 2009
Database administrators are critically important contributors in modern enterprises, ensuring that key infrastructure is performing optimally in support of the organization's goals. Like employees in every department, the best DBAs are constantly seeking to increase the value of their contributions and, correspondingly, to increase their compensation and to advance in their organizations. Increasing knowledge and skills and taking more important responsibilities are time-honored methods for career advancement. Here are 10 concrete suggestions for DBAs looking to get ahead.
Posted January 15, 2009
Combining Database Clustering and Virtualization to Consolidate Mission-Critical Servers
Posted January 15, 2009
The old maxim, "may you live in interesting times," certainly holds true for IT managers and professionals these days. The year 2008 was full of changes and challenges, and 2009 promises even more.
Posted December 15, 2008
Data is the byproduct of the information age and is being generated, processed and stored at an exponential rate. Storage area networks (SANs) have become the infrastructure of choice for networking, transporting and storing data traffic. As this trend continues, many IT managers are faced with network congestion and I/O bottlenecks. To alleviate congestion and increase network bandwidth, enterprises are looking to 8Gb/s Fibre Channel technology.
Posted October 15, 2008
Data transmission is growing rapidly, and the digitization of everything from financial transactions to video is enabling organizations to quickly share information with global partners both inside and outside their trusted network. However, many organizations do not recognize the operational, financial and security risks associated with this growing proliferation of supposedly secure, user-managed file transfer systems.
Posted October 15, 2008
The industry is buzzing with talk of endpoint virtualization. This innovation is often seen as a means to reduce enterprise endpoint costs and increase the agility of new endpoint deployments. However, as many organizations discovered as they implemented server virtualization, unless such technologies are integrated within a single infrastructure framework that spans both the physical and virtual, they can add rather than reduce complexity and cost.
Posted October 15, 2008
We all know software piracy causes huge financial losses. It has been estimated that the world's software companies are now losing $40 billion in revenue to unlicensed installations. Yet, with all the security technology at our disposal, why isn't piracy going away? While some areas have been able to squelch a certain percentage of software theft, the problem is here to stay. The huge influx of new PC users, the ubiquitous nature of piracy tools over peer-to-peer networks, and the near-impossibility of enforcement across the globe stand in the way of significant progress. Moreover, the outsourcing of development work opens up new worries for those dealing with countries with weak intellectual property (IP) enforcement laws.
Posted September 15, 2008
Implementing comprehensive database security solutions can be an onerous task. Security requirements are always changing and new compliance requirements are constantly emerging. Despite this dynamic environment, there are simple steps that can be undertaken to dramatically and quickly reduce risk. Database security solutions are only as secure as the weakest link. Forward-thinking organizations should begin by addressing the vulnerabilities that are the most obvious and easiest to exploit.
Posted September 15, 2008
Claims that the mainframe is a near-death technology in the mission-critical world of today's robust business intelligence (BI) applications are exaggerated. Conventional wisdom says the mainframe, the "powerhouse" of corporate computing, is simply too costly, too complex, and incapable of supporting a comprehensive BI system. Not so.
Posted September 15, 2008
The Capitol Corridor Joint Powers Authority (CCJPA) manages an Amtrak intercity passenger train service in eight Northern California counties, partnering with Amtrak, the Union Pacific Railroad, and Caltrans, the California Department of Transportation. Serving 16 stations along a 170-mile rail corridor, CCJPA offers a convenient way to travel between the Sierra Foothills, Sacramento, the San Francisco Bay Area, San Jose and the rest of the Silicon Valley.
Posted September 15, 2008
Early discussions on SQL Server 2008 seemed to suggest that it would really only be a point release, quite unlike what occurred with SQL Server 2005. Anyone looking at the new and upgraded features in SQL Server 2008 would soon realize that it offers much more than that. Given that SQL Server 2005 took some time to achieve mass adoption, the question that arises is how fast users will migrate to SQL Server 2008.
Posted September 15, 2008
Now more than ever, data has evolved into an asset more strategic and valuable than any raw material or capital construction project. Companies are scrambling to "compete on analytics," recognizing that those who most effectively leverage the information coming out of their systems gain the greatest competitive advantage.
Posted September 15, 2008
When you pick up the morning paper or turn on the news, you don't expect a story about your own credit or debit card information being at risk. However, recent events, such as the announced security breaches at the Hannaford supermarket chain and the Okemo Mountain Resort in Vermont, indicate that this will become an all-too-common occurrence.
Posted August 15, 2008
Any system needs to be tested. And it's a simple fact that testing is better done by people independent of the system being tested. A different perspective can often highlight new areas of weakness, and there is no conflict of interest in managing a "pass."
Posted August 15, 2008
Psychologist Philip Zimbardo once said, "Situational variables can exert powerful influences over human behavior, more so than we recognize or acknowledge." That certainly appears to be true when we look at how we work with people who provide services to us in our personal lives versus those who do it in the business world. In our personal lives, we tend to hire specialists. Yet, in the business world we always seem to want to take the "holistic" route, i.e., find that one supplier who can do everything for us.
Posted August 15, 2008