Database Management Articles
TransLattice, a provider of distributed databases and application platforms for enterprise, cloud and hybrid environments, has released TransLattice Elastic Database (TED), which the company describes as the world's first geographically distributed relational database management system (RDBMS). A single database can run on multiple TransLattice nodes around the world, allowing for greater data availability, performance, and scalability at a lower cost than traditional databases. "Since we have the ability to pre-position the data close to end users and have a node that's operating on their behalf in distributed queries, we can offer a much higher level of user experience than conventional systems," Michael Lyle, CTO of TransLattice, explains to DBTA. Additionally, TED makes it easier for global enterprises to comply with data jurisdiction policy requirements.
Posted July 31, 2012
SAP AG has announced a free mobile developer license, a new SAP Mobile Apps Partner program, and additional support for integrating the software development frameworks from Adobe, Appcelerator Titanium, and Sencha with the SAP mobile platform. With the new programs, SAP seeks to encourage the developer community to create new mobile apps for business-to-business (B2B) and business-to-consumer (B2C) environments, David Brutman, senior director, Developer Relations at SAP, tells 5 Minute Briefing.
Posted July 30, 2012
Pentaho's Business Analytics 4.5 is now certified on Cloudera's latest releases, Cloudera Enterprise 4.0 and CDH4. Pentaho also announced that its visual design studio capabilities have been extended to the Sqoop and Oozie components of Hadoop. "Hadoop is a very broad ecosystem. It is not a single project," Ian Fyfe, chief technology evangelist at Pentaho, tells DBTA. "Sqoop and Oozie are shipped as part of Cloudera's distribution so that is an important part of our support for Cloudera as well - providing that visual support which nobody else in the market does today."
Posted July 27, 2012
Oracle has introduced a new migration tool that aims to make it easier for users to migrate data from SQL Server to MySQL. The new migration tool is integrated into MySQL Workbench, which allows the visual design, development and administration of MySQL Databases. According to Oracle, with MySQL Workbench, SQL Server developers and DBAs can easily convert existing applications to run on MySQL, both on Windows and other platforms. In addition, to address the growing demand from business analysts using MySQL for data marts and analytic applications, Oracle has announced a new "MySQL for Excel" application plug-in that allows users to import, export and manipulate MySQL data, without requiring prior MySQL technical knowledge.
Posted July 25, 2012
Big data is one of the most significant industry disruptors in IT today. Even in its infancy, it has shown significant ROI and has almost universal relevance to a wide cross-section of the industry. Why? Big data turns traditional information architecture on its head, putting into question commonly accepted notions of where and how data should be aggregated, processed, analyzed, and stored. Enter Hadoop and NoSQL, the open source data-crunching platforms. Although these technologies are hotter than an Internet IPO, you simply can't ignore your current investments - the investments in SQL that drive everything from your data warehouse to your ERP, CRM, SCM, HCM and custom applications.
Posted July 25, 2012
Syncsort, a global leader in high-performance data integration solutions, has certified its DMExpress data integration software for high-performance loading of Greenplum Database. Syncsort has also joined the Greenplum Catalyst Developer Program. Syncsort DMExpress software delivers extensive connectivity that makes it easy to extract and transform data from nearly any source, and rapidly load it into the massively parallel processing (MPP) Greenplum Database without the need for manual tuning or custom coding. "IT organizations of all sizes are struggling to keep pace with the spiraling infrastructure demands created by the sheer volume, variety and velocity of big data," says Mitch Seigle, vice president, Marketing and Product Management, Syncsort.
Posted July 25, 2012
SAP AG has announced the general availability of the on-demand edition of SAP Sybase SQL Anywhere. For independent software vendors (ISVs) adding software-as-a-service (SaaS) offerings to their product portfolio or building new SaaS applications, SAP Sybase SQL Anywhere, on-demand edition, is a data management solution that enables ISVs to build, deploy and manage cloud applications as their applications and customers demand, without having to compromise. The new product allows users to take advantage of the cloud's economies of scale while providing them with the tools they require to help ensure they can still treat each of their customers individually.
Posted July 25, 2012
Throughout the 2000s, a huge number of website developers rejected the Enterprise Java or .NET platforms for web development in favor of the "LAMP" stack - Linux, Apache, MySQL and Perl/Python/PHP. Although the LAMP stack was arguably less scalable or powerful than the Java or .NET frameworks, it was typically easier to learn, faster in early stages of development - and definitely cheaper. When enterprise architects designed systems, they often chose commercial application servers and databases (Oracle, Microsoft, IBM). But, when web developers or startups faced these decisions, the LAMP stack was often the default choice.
Posted July 25, 2012
Dell is introducing a new Big Data retention solution aimed at reducing the costs of retaining big data while helping to improve management for easy retrieval and analysis. "Not only are the volumes of data growing - so is its value to the organization," says Darren Thomas, vice president and general manager, Dell Enterprise Storage. "Being able to cost-effectively capture and store all of the relevant data makes it possible to gain insights that support innovation and business value. The key is having the right data management solution to fluidly move data, tier it, dedupe it, protect it and archive it. To us, that means your data is fluid."
Posted July 24, 2012
Radiant Logic, provider of identity virtualization services, has released a federated identity service, RadiantOne VDS 6. The product provides a single access point for enterprise identities, enabling authentication and authorization across multiple identity sources and authentication protocols. Built for the cloud and for federation in high-volume, heterogeneous environments, VDS 6 supports single sign-on (SSO) access control.
Posted July 19, 2012
In the always-on global economy, access to enterprise data is critical, and interruptions and delays in the flow of information necessary for decision-making can have serious consequences. A new study fielded by Unisphere Research among IOUG members and sponsored by Oracle explores the issues related to planned and unplanned downtime, alongside high availability and disaster recovery solutions. The survey, which garnered input from 358 data managers and professionals, finds that at least half of the respondents are working to provide their organizations with real-time or near-real-time data, but meeting this goal is becoming harder as data volumes escalate and the variety and velocity of data increase as well.
Posted July 19, 2012
Quest Software, which earlier this month announced it had entered into a definitive agreement to be acquired by Dell, has enhanced its unified partner program to better meet the needs of its reseller, distribution, service providers (SPs), and global partners. Launched in July a year ago, the Quest Partner Circle (QPC) is now being expanded to include SPs and global partners. In addition, SPs and global partners now have access to QPC infrastructure benefits, including partner training, certification, marketing, business planning, and dedicated technical support.
Posted July 19, 2012
HiT Software, a provider of data replication and change data capture (CDC) solutions for heterogeneous database environments, has announced the release of DBMoto Cloud Edition, the first product in a family of data replication and CDC software designed to support cloud data traffic. DBMoto Cloud Edition has been designed to deliver rapid, secure data updates between disconnected databases. Data changes are captured and passed through the cloud using web services, providing reliable CDC for satellite enterprise databases that have no direct connection to corporate IT systems.
Posted July 16, 2012
After chatting with my friend and fellow Microsoft MVP Allen White about Windows Server Core on a recent SQLCruise.com excursion, I realized that this is a technology I should be evangelizing more. I hope you've heard about Windows Server Core and are considering using it for your SQL Server, and, indeed, any relational database platform you're currently running on Windows Server. Why?
Posted July 11, 2012
The whole world can be divided into two groups, these being splitters and lumpers. Design battles are waged across conference rooms as debates rage over whether to split or to lump. Splitters take a group of items and divide them into sub-groups and sub-sub-groups, occasionally going so far as to end with each lowest level becoming a group of one. On the other side of the design fence, lumpers combine items until everything is abstracted into group objects covering very broad territory, such as a "Party" construct, or ultimately an "Object" object. Within data modeling, arguments arise over whether to sub-type an entity, or lumping is debated as the grain of a multidimensional fact is proposed. This debate underlies much of the decision-making involved in determining what domains to create within a data model. The split-versus-lump issue is ubiquitous and universal. The question of whether to split or to lump arises across many kinds of choices beyond the entity definitions, table grain, and domain grain mentioned above; it is at the heart of deliberations over establishing functions, overriding methods, or composing an organizational structure.
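As an illustration only (the table and column names here are hypothetical), the same requirement can be modeled either way: a splitter gives each kind of party its own table, while a lumper folds them into one generalized structure.

    -- "Split" design: one table per distinct kind of party
    CREATE TABLE customer (
        customer_id   INTEGER PRIMARY KEY,
        customer_name VARCHAR(100) NOT NULL
    );
    CREATE TABLE supplier (
        supplier_id   INTEGER PRIMARY KEY,
        supplier_name VARCHAR(100) NOT NULL
    );

    -- "Lump" design: a single generalized Party table with a type discriminator
    CREATE TABLE party (
        party_id    INTEGER PRIMARY KEY,
        party_type  CHAR(1)      NOT NULL,  -- 'C' = customer, 'S' = supplier
        party_name  VARCHAR(100) NOT NULL
    );

Neither form is inherently right: the split design keeps constraints and columns specific to each group, while the lumped design absorbs new party types without schema changes.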
Posted July 11, 2012
Although data integrity is a pervasive problem, there are some data integrity issues that can be cleaned up using a touch of SQL. Consider the common data entry problem of extraneous spaces in a name field. Not only is it annoying; sometimes it can cause the system to ignore relationships between data elements.
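A minimal sketch of that kind of cleanup, assuming a hypothetical customer table with a cust_name column (the exact string functions available vary by DBMS):

    -- Strip leading and trailing spaces wherever they exist
    UPDATE customer
       SET cust_name = LTRIM(RTRIM(cust_name))
     WHERE cust_name <> LTRIM(RTRIM(cust_name));

    -- Flag any rows that still contain embedded double spaces
    SELECT cust_id, cust_name
      FROM customer
     WHERE cust_name LIKE '%  %';

Running the cleanup once is rarely enough; pairing it with an edit check at data entry time keeps the extra spaces from creeping back in.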
Posted July 11, 2012
Ntirety, a pioneer of database administration as a service for Microsoft SQL Server, Oracle and MySQL databases, announced that it has successfully completed VMware's Virtualization of Business Critical Applications (VBCA) solution competency for Microsoft SQL Server, Exchange and Oracle databases.
Posted July 09, 2012
NuoDB, Inc., a Cambridge, Massachusetts-based startup, has announced the release of Beta 7 of its web-scale database. The company says the new release provides five times the scalability of the previous version (50 nodes) among other new features. NuoDB takes a shared nothing, asynchronous, peer-to-peer approach that the company says is ideal for the cloud, while also delivering the power, reliability and functionality of a traditional database. The database is planned to be generally available in Q3 2012.
Posted July 03, 2012
Dell and Quest Software announced they have entered into a definitive agreement for Dell to acquire Quest, an IT management software provider offering a broad selection of solutions that solve the most common and most challenging IT problems. Under terms of the agreement, approved by the boards of directors of both companies, Dell will pay $28 per share in cash for each share of Quest for an aggregate purchase price of approximately $2.4 billion, net of Quest's cash and debt. The transaction is expected to close in Dell's third fiscal quarter, subject to approval by Quest's shareholders and customary conditions.
Posted July 02, 2012
SEPATON, Inc., which provides disk-based data protection solutions specifically designed for large enterprises, has released DeltaStor DBeXstream software, which is intended to enable the backup and restore of large databases at industry-leading rates while also delivering high capacity reduction through byte-differential deduplication. The software is part of the 6.1 release of its enterprise-optimized data protection software, which powers its S2100 systems.
Posted June 20, 2012
Quest Software, Inc. says it has entered into an amendment to a previously announced merger agreement with affiliates of Insight Venture Partners to add Vector Capital as a member of the buyout group, and for an increase in the offer from $23 per share in cash to $25.75 per share in cash. According to Quest, the increased purchase price represents a 33% premium to Quest's closing stock price on the day prior to the announcement of the Insight merger agreement on March 8, 2012.
Posted June 20, 2012
Ntirety's CEO and founder, Michael Corey, has been named both an Oracle ACE and a VMware vExpert. Corey was independently selected for both designations because of his expertise and his public advocacy for Oracle databases and contributions to the VMware community of users. The VMware vExpert title is given annually to individuals for their commitment to sharing knowledge and passion for VMware technology above and beyond their job requirements. The Oracle ACE Program recognizes individuals for advocating the company's technology and applications and is based on the significance of their contributions and activity in the Oracle database community.
Posted June 19, 2012
In the dim, dark past of data warehousing, there was a time when the argument was put forward that "history does not change." It was posited that once a piece of data was received by the data warehouse, it was sacrosanct and nonvolatile. A fact record, once processed, was to remain unchanged forever. Dimensions, due to their descriptive nature, could be changed following the prescribed Type 1, 2, or 3 update strategies, but that was all. The expectation was that, due to their very nature, fact tables would become huge, and in being huge would give update performance so poor that updates would be virtually impossible to enact.
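For readers who have not worked with the dimension update strategies mentioned above, a minimal sketch of a Type 2 change, using a hypothetical customer_dim table (column names are illustrative and syntax varies by DBMS): rather than overwriting the old value as Type 1 would, the current row is expired and a new row is inserted so history is preserved.

    -- Close out the current version of the dimension row
    UPDATE customer_dim
       SET current_flag = 'N',
           row_end_date = CURRENT_DATE
     WHERE customer_id  = 1001
       AND current_flag = 'Y';

    -- Insert a new version carrying the changed attribute
    INSERT INTO customer_dim
        (customer_id, customer_city, current_flag, row_start_date, row_end_date)
    VALUES
        (1001, 'Boston', 'Y', CURRENT_DATE, NULL);

Facts, by contrast, were expected to be written once and left untouched.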
Posted June 13, 2012
By now, you've heard that Microsoft has publicly released SQL Server 2012. I have to be honest in telling you that it came sooner than I expected, despite my many inside connections at Microsoft. I was fully expecting the RTM to occur a bit before summer, just in time for a spectacular launch at Microsoft TechEd.
Posted June 13, 2012
Unless you plan for and issue regular COMMITs in your database programs, you will be causing locking problems. It is important for every programmer to issue COMMIT statements in all application programs where data is modified (INSERT, UPDATE, and DELETE). A COMMIT externalizes the modifications that occurred in the program since the beginning of the program or the last COMMIT. A COMMIT ensures that all modifications have been physically applied to the database, thereby ensuring data integrity and recoverability. Failing to code COMMITs in a data modification program is what I like to call "Bachelor Programming Syndrome" — in other words, fear of committing.
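A minimal sketch of the pattern, using hypothetical table names (in a real batch program the modifications and COMMITs sit inside a loop, committing every N rows rather than after every statement):

    -- First unit of work: modify a bounded set of rows, then externalize the changes
    UPDATE employee
       SET dept_id = 'D42'
     WHERE dept_id = 'D41';
    COMMIT;

    -- Next unit of work acquires its own locks and releases them with its own COMMIT
    DELETE FROM employee_audit
     WHERE audit_date < '2011-01-01';
    COMMIT;

Each COMMIT releases the locks taken since the previous one, so other programs are not forced to wait behind a long-running batch job.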
Posted June 13, 2012
Companies are scrambling to learn all the various ways they can slice, dice, and mine big data coming in from across the enterprise and across the web. But with the rise of big data — hundreds of terabytes or petabytes of data — comes the challenge of where and how all of this information will be stored. For many organizations, current storage systems — disks, tapes, virtual tapes, clouds, in-memory systems — are not ready for the onslaught, industry experts say. There are new methodologies and technologies coming on the scene that may help address this challenge. But one thing is certain: Whether organizations manage their data in their internal data centers or in the cloud, a lot more storage is going to be needed. As Jared Rosoff, director of customer engagement with 10gen, puts it: "Big data means we need 'big storage.'"
Posted June 13, 2012
To help customers optimize the value of their technology investments, Oracle has announced Oracle Platinum Services. Oracle Platinum Services are available at no charge as part of a standard support contract to customers with Oracle Exadata X2-2 and X2-8, Oracle Exalogic X2-2, or SPARC SuperCluster T4-4 servers with Exadata storage, Sun ZFS Storage, or Oracle's Pillar Axiom 600 storage. To access Oracle Platinum Services, customers install a patented, secure monitoring gateway that will allow Oracle to deploy quarterly updates on their behalf. With the new services, Oracle is delivering what it thinks is the highest level of service in the industry, says Larry Abramson, senior vice president and general manager, Oracle Advanced Customer Support Services.
Posted June 13, 2012
QlikTech, a business intelligence software provider, has acquired Expressor Software, a Burlington, Massachusetts-based data management software company and QlikTech Qonnect partner. "Expressor fits squarely in our acquisition strategy. We are acquiring complementary, tuck-in technology that will enhance the value we provide to our customers as we further develop and bring to market these solutions. We are also adding more than 20 outstanding people, primarily software engineers, expanding our R&D skill set," said Lars Björk, CEO of QlikTech.
Posted June 13, 2012
Ravi Pendekanti heads Systems Product Marketing for Oracle on a global basis. He has been in the Systems industry for more than two decades, working in the areas of servers, storage, software and networking. In this article, Exabriefing talks with Pendekanti about the Oracle engineered systems approach - what's been learned and why it works.
Posted June 13, 2012
Cloudera has unveiled the fourth generation of its flagship Apache Hadoop data management platform, Cloudera Enterprise. Cloudera Enterprise 4.0 combines the company's Cloudera Manager software with expert technical support to provide a turnkey system for deploying and managing Hadoop in production. The company also announced the general availability of CDH4 (Cloudera's Distribution Including Apache Hadoop, version 4), resulting from the successful completion of a beta program among its enterprise customers and partner ecosystem and the contributions of Cloudera's engineering team and the greater Apache open source community.
Posted June 06, 2012
It seems only reasonable that what one person can do, others can learn. On the other hand, taking people through training does not usually result in the creation of great new database administrators (DBAs). It often appears as if those who are exceptional at the craft operate at higher levels as they dive into a problem. Can training alone provide folks with the attention to detail, the urge to keep digging, or the ability to recall minutiae that allow them to rise from simply holding the DBA title to becoming someone who is a great DBA? Or must the genetic potential exist first, so that one falls into the DBA occupation and astounds those around them? It is very hard to say with any degree of certainty whether great DBAs are made or born; yet again the battle between nature and nurture arises.
Posted June 06, 2012
Embarcadero Technologies has introduced a new version of its database management and development platform, DB Power Studio XE3, which offers enhancements to further improve the performance and availability of databases.
Posted June 06, 2012
Informix Genero, a new IBM offering developed in partnership with Four Js Development Tools, is a logical enhancement to the Informix 4GL language and environment that offers extensive capabilities for developing modern web and desktop GUI applications, reports, and web services. With IBM Informix Genero, users can recompile 4GL applications and run them as GUI and web applications while retaining the business logic.
Posted June 05, 2012
Kapow Software, a provider of social media, cloud, and big data solutions, has announced a strategic partnership with Informatica Corporation to deliver Informatica PowerExchange for Kapow Katalyst. This solution harnesses the power of the web, cloud applications and social media, allowing IT and line-of-business users alike to access and extract relevant information from disparate data sources in real time. The solution will be integrated into Informatica 9.5, which is scheduled to be released at the end of June.
Posted May 29, 2012
The PostgreSQL Global Development Group has announced the beta release of PostgreSQL 9.2, which it says will include increases in performance as well as vertical and horizontal scalability.
Posted May 29, 2012
Idera, a provider of application management solutions for Windows and Linux, has introduced SharePoint diagnostic manager 2.7, which provides new capacity planning features enabling administrators to predict when they will need to expand or upgrade hardware in order to prevent performance issues caused by growth of SharePoint sites and content. This release also provides expanded analytics for SQL Server performance, web page health and uptime, and content usage and composition.
Posted May 29, 2012
Despite IT industry talk about the need for real-time data, a new survey of more than 330 data managers and professionals who are subscribers to Database Trends and Applications reveals that access to current data to support decision making is not actually possible for many companies. At least half of companies represented in the survey indicate that relevant data still takes 24 hours or longer to reach decision makers.
Posted May 23, 2012
The term "big data" refers to the massive amounts of data being generated on a daily basis by businesses and consumers alike - data which cannot be processed using conventional data analysis tools owing to its sheer size and, in many case, its unstructured nature. Convinced that such data hold the key to improved productivity and profitability, enterprise planners are searching for tools capable of processing big data, and information technology providers are scrambling to develop solutions to accommodate new big data market opportunities.
Posted May 23, 2012
InterSystems Corporation has launched the next generation of its InterSystems HealthShare strategic informatics platform for interoperability and active analytics. Designed originally for public health information exchanges (HIEs) at regional, state and national levels, HealthShare has been extended and re-architected to also deliver the advanced technologies needed by integrated delivery networks (IDNs). The new version of HealthShare addresses the need in the healthcare field to enable access to up-to-the-minute information and to help users more easily gain insight from unstructured content, Dominick Bizzarro, HealthShare global business manager, InterSystems, tells 5 Minute Briefing.
Posted May 22, 2012
For those unable to attend last month's ISUG Virtual ASE presentation, the recorded replay is available to view on-demand. This advanced, technical session, presented by Sybase database expert Edward Stangler of Bradmark Technologies along with special guest speaker Rob Verschoor, senior technology evangelist at Sybase, an SAP company, discusses what various SQL, RPCs, stored procedures, and batches look like in the MDA tables and how to calculate their metrics in Sybase ASE 15.0.3 and later. The impact of other Sybase ASE features, such as statement cache and Cluster Edition, is also explored.
Posted May 22, 2012
Confio Software has introduced its Ignite 8.2 database performance software. The rapidly growing company, based in Boulder, Colo., has tripled in size during the past three years, delivering tools that identify, pinpoint and resolve pain points in Sybase, Oracle, SQL Server, and DB2 databases, under the theme of "Keep it simple and get to the problem in four clicks." The new Ignite 8.2 release includes enhanced features offering expanded support for Sybase; Oracle Exadata, RAC and ASM; as well as for SQL Server.
Posted May 22, 2012
Plans are underway for an event specifically focused on Sybase PowerBuilder and tools that will be separate from TechEd but held at the same time and location, according to Christine Weber, marketing manager, Events, at Sybase, an SAP company. There will also be close to 100 hours of sessions specifically focused on Sybase database and analytics products at SAP TechEd 2012, Weber tells 5 Minute Briefing. "It is a good portion and it is focused on the traditional kind of content that we have always done with Sybase." Sybase-specific content will include tips and tricks on how to use existing products, as well as previews of what's ahead in new product releases. Now that the call for papers has closed, Sybase is going through its approval process for the presentations. Well over 200 presentations were submitted - "a good problem to have," Weber notes.
Posted May 22, 2012
SAP AG has announced a range of innovations. The announcements put heavy emphasis on SAP HANA and focus on three main areas, David Jonker, director of product marketing - Data Management & Analytics, SAP, tells 5 Minute Briefing. The first is working with customers to accelerate their existing investments in the SAP landscape; the second is big data and analytics; and the third is sparking new innovation by working with customers, partners and startups to leverage real-time analytics to rethink how business gets done.
Posted May 22, 2012