Database Management Articles
Cisco and IBM are partnering to provide instant Internet of Things (IoT) insights at the edge of the network. The new approach combines technology from both companies to allow businesses and organizations in remote and autonomous locations to tap the benefits of IBM's Watson IoT and business analytics technologies and Cisco's edge analytics capabilities.
Posted June 02, 2016
Oracle is introducing version 4.0 of its NoSQL database. First introduced in 2011, the Oracle NoSQL Database is a key-value database that evolved from the company's acquisition of BerkeleyDB Java Edition, a mature, high-performance embeddable database. Ashok Joshi, senior director of NoSQL, Berkeley Database, and Database Mobile Server at Oracle, outlined the key enhancements in the new release.
Posted June 01, 2016
The ability to stand up and voice your opinion about a solution or technology that will not solve a business problem is critical. At times, choices are made for budgetary reasons, and other times, it may be organizational pressure. But regardless of the reason, as IT professionals, we need to be courageous and do the right thing, whatever that is.
Posted June 01, 2016
CSC's board of directors has unanimously approved a plan to merge the company with the Enterprise Services segment of Hewlett Packard Enterprise (HPE). The strategic combination of the two businesses will create what the companies' executives called one of the world's largest pure-play IT services companies. The new company is expected to have annual revenues of $26 billion and more than 5,000 clients in 70 countries.
Posted May 31, 2016
Dynatrace, a management tools provider, has teamed up with Pivotal, a digital services tools vendor, to deploy its application monitoring solutions for the Pivotal Cloud Foundry (PCF) platform. Dynatrace Application Monitoring Service Broker Tile and Buildpack Extensions for Pivotal Cloud Foundry will provide actionable performance insights for businesses with cloud initiatives.
Posted May 31, 2016
For the first time, scientists at IBM Research have demonstrated reliably storing three bits of data per cell using a relatively new memory technology known as phase-change memory (PCM). The current memory landscape spans from venerable DRAM to hard disk drives to ubiquitous flash. But in the last several years, PCM has attracted the industry's attention as a potential universal memory technology based on its combination of read/write speed, endurance, non-volatility and density. For example, PCM doesn't lose data when powered off, unlike DRAM, and the technology can endure at least 10 million write cycles, compared to an average flash USB stick, which tops out at 3,000 write cycles.
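To put those figures in perspective, the arithmetic below is purely illustrative: storing 3 bits per cell means each cell must hold one of 2^3 = 8 distinguishable resistance levels, and the endurance comparison is computed directly from the numbers cited above, not from any IBM specification.

```python
# Illustrative arithmetic only; the figures come from the announcement above,
# not from IBM's PCM implementation details.
bits_per_cell = 3
levels_needed = 2 ** bits_per_cell   # 3 bits per cell -> 8 distinguishable resistance levels
print(f"{bits_per_cell} bits per cell requires {levels_needed} resistance levels")

pcm_write_cycles = 10_000_000        # endurance cited for PCM
flash_write_cycles = 3_000           # endurance cited for an average flash USB stick
print(f"PCM endures roughly {pcm_write_cycles // flash_write_cycles:,}x more write cycles")
```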
Posted May 31, 2016
Thanks to the cloud and other empowering technologies such as Hadoop and Apache Spark, we're at the tipping point for big data. These technologies now provide a path to big data success for companies that otherwise lack the specialized big data skills or the heretofore proprietary (and expensive) infrastructure to do it themselves. As 2016 progresses, we'll see the broader market put big data capabilities to work, and the benefits of big data will, in turn, spread beyond the privileged few companies that were early big data adopters.
Posted May 25, 2016
COLLABORATE, the annual conference presented by the OAUG, IOUG, and Quest, provides the opportunity to reflect on key changes in the Oracle ecosystem and allows the users groups to engage with their constituents about the areas of greatest importance. With the COLLABORATE 16 conference now behind her, Dr. Patricia Dues, the new president of the OAUG, talked with DBTA about what OAUG members are concerned with now and how the OAUG is helping them address emerging challenges.
Posted May 25, 2016
EMC Corp.'s Enterprise Content Division (ECD) is releasing an upgraded version of its EMC InfoArchive platform, enhancing the ability to secure and leverage large amounts of critical data and content.
Posted May 25, 2016
Dell is updating its SharePlex database replication and near real-time data integration solution to enable users to replicate Oracle data directly to SAP HANA, Teradata, or EnterpriseDB Postgres.
Posted May 25, 2016
It seems every week there is another data breach in the news, which translates to millions and millions of personal records, credit card numbers, and other pieces of confidential information stolen each month. The victims of these breaches include important companies with professional IT staff. Now, you may be thinking: "Shouldn't the network guys be responsible for security?"
Posted May 25, 2016
Redmonk's annual programming language rankings - based on GitHub and StackOverflow traffic - were recently released, and, to no one's surprise, JavaScript was ranked as the most popular programming language.
Posted May 18, 2016
Magnitude Software, a provider of enterprise information management (EIM) software, has acquired Datalytics Technologies LLC, a data warehouse solutions company. Founded in 1984, Datalytics Technologies offers pre-built data warehouse and data mart solutions for SAP's and Oracle's families of ERP systems.
Posted May 18, 2016
AtScale, Inc., which provides a self-service BI platform for Hadoop, has raised a Series B round of $11 million, bringing its total funding to date to $20 million. According to Bruno Aziza, chief marketing officer of AtScale, its platform is different from others in three key ways, making it applicable to use cases in an array of industries including healthcare, telecommunications, retail, and financial services.
Posted May 17, 2016
Bringing another tool to its product portfolio, IDERA has released SQL BI Check, a no-cost option for real-time performance monitoring of SQL Server Analysis Services (SSAS). SQL BI Check, with agentless installation, provides data on performance metrics with graphs and charts that assess SSAS health.
Posted May 17, 2016
MarkLogic has announced that MarkLogic 9 is available for early access. The latest release of the enterprise NoSQL database features enhancements in data integration, security, and manageability.
Posted May 10, 2016
The world's data has doubled in 18 months' time. The industry estimates that the global amount of storage will reach 40 ZB by 2020. Historically, storage architectures were built on solutions that could only scale vertically. This legacy approach presents significant challenges to storing the tremendous quantities of data being created today in a way that is cost-effective and maintains high levels of performance. Today, most of the world's data centers still rely on vertical scaling for storage, which means that organizations are seeking alternatives that allow them to scale cheaply and efficiently in order to remain competitive. And now, with software-defined storage moving forward, we see more scale-out storage solutions being used in data centers.
Posted May 04, 2016
The latest release of Oracle Database (12.1.0.2) offers a unique set of features that promise significant gains in application workload performance, especially for analytics and data warehousing queries. This release debuts Oracle Database In-Memory, which provides a new columnar format - the In-Memory Column Store (IMCS) - for data that is likely to be accessed regularly for aggregation or analysis, as well as other features such as In-Memory Aggregation and In-Memory Joins that can potentially offer several orders of magnitude of performance improvement. Finally, the new In-Memory Advisor makes short work of determining exactly which database objects are most likely to benefit from the IMCS.
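As a hedged illustration of how the column store is enabled in practice, the sketch below assumes an Oracle 12.1.0.2 instance with the INMEMORY_SIZE parameter already configured, a hypothetical sales table, and placeholder connection credentials; it simply marks the table for in-memory population and then checks the v$im_segments view.

```python
# Minimal sketch, assuming Oracle Database 12.1.0.2 with INMEMORY_SIZE set;
# the "sales" table and connection details are hypothetical placeholders.
import cx_Oracle

conn = cx_Oracle.connect("app_user", "app_password", "dbhost/orcl")
cur = conn.cursor()

# Flag the table for population into the In-Memory Column Store (IMCS).
cur.execute("ALTER TABLE sales INMEMORY PRIORITY HIGH")

# Check whether the segment has been populated into the column store.
cur.execute("SELECT segment_name, populate_status FROM v$im_segments")
for segment_name, status in cur:
    print(segment_name, status)

conn.close()
```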
Posted May 04, 2016
Being able to assess the effectiveness and performance of your database systems and applications is one of the most important things that a DBA must be able to do. This can include online transaction response time evaluation, sizing of the batch window and determining whether it is sufficient for the workload, end-to-end response time management of distributed workload, and more. But in order to accurately gauge the effectiveness of your current environment and setup, service level agreements, or SLAs, are needed.
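As a simple illustration of turning an SLA into something measurable, the sketch below uses made-up response time samples and a hypothetical target of 95% of transactions completing within 300 ms; in practice, the thresholds would come from the negotiated service level agreement.

```python
# Hypothetical SLA check; the sample data, percentile, and threshold are illustrative.
from statistics import quantiles

response_times_ms = [120, 95, 240, 180, 310, 150, 90, 200, 175, 260]
sla_target_ms = 300      # e.g., 95% of transactions must finish within 300 ms
sla_percentile = 95

# quantiles(..., n=100) returns the 1st through 99th percentiles.
p95 = quantiles(response_times_ms, n=100)[sla_percentile - 1]
print(f"95th percentile response time: {p95:.0f} ms")
print("SLA met" if p95 <= sla_target_ms else "SLA violated")
```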
Posted May 04, 2016
Many thought it was an early April Fool's Day prank, but it was no joke: On March 7, 2016, Microsoft announced the beta release of SQL Server on Linux with the intention of shipping a full release of the product by April of 2017.
Posted May 04, 2016
Oracle has entered into a definitive agreement to acquire Opower, a provider of customer engagement and energy efficiency cloud services to utilities, in a transaction valued at approximately $532 million. Separately, Oracle also announced another agreement to acquire Textura, a provider of construction contracts and payment management cloud services, in a deal valued at approximately $663 million.
Posted May 04, 2016
What makes an IT organization static or dynamic? What triggers an organization to move from one to the other? The transformation is not easy and it certainly does not happen quickly. These questions can also be asked at a personal level. As an IT professional, are you more likely to be static or dynamic?
Posted May 04, 2016
Say what you will about Oracle, it certainly can't be accused of failing to move with the times. Typically, Oracle comes late to a technology party but arrives dressed to kill.
Posted May 04, 2016
Qlik is unveiling the next version of its Qlik Sense Enterprise platform, combining enterprise readiness and governance with visualization and data preparation capabilities, according to the company.
Posted May 03, 2016
TimeXtender, a provider of data warehouse automation (DWA) software for Microsoft SQL Server, is partnering with Qlik to offer a complete data warehouse and business intelligence package to users across the globe.
Posted May 03, 2016
Analytics provider Lavastorm is partnering with Qlik, a provider of visual analytics software, to offer an integrated solution for a broader swath of users of all skill levels through Qlik Sense.
Posted May 03, 2016
Led by Pivotal, GE Ventures, and GTD Capital, SnappyData has secured $3.65 million in Series A funding that will help it grow and further its technologies.
Posted April 29, 2016
Accenture and Splunk have formed an alliance that integrates Splunk products and cloud services into Accenture's application services, security, and digital offerings. The goal is to help customers improve business outcomes by mining vast amounts of application and operational data to identify trends and opportunities for improvement that were previously difficult to detect.
Posted April 28, 2016
Snowflake Computing, a provider of cloud-based data warehousing, and MicroStrategy, a provider of enterprise software platforms, are teaming up to provide cloud-enabled analytics. The collaboration will build on Snowflake's certified connectivity with MicroStrategy 10 through further product integration and go-to-market collaboration.
Posted April 28, 2016
Magnitude Software, a provider of Enterprise Information Management (EIM) software, unveiled a new master data management offering designed to fuel business processes with accurate customer data for informed decision making.
Posted April 27, 2016
Database as a service, also known as DBaaS, offers a solution to some key issues that have vexed enterprise database shops for years. That is, how to maintain and update back-end technologies; how to integrate data from multiple, changeable sources without the need to rewrite the applications that depend on them; and how to make data readily accessible to end users who need it regardless of the device they are using.
Posted April 27, 2016
BackOffice Associates, a provider of information governance and data modernization solutions, is acquiring CompriseIT, a U.K. consulting firm specializing in helping enterprises adopt SAP Business Suite 4 SAP HANA (SAP S/4HANA). BackOffice Associates' acquisition of CompriseIT is the latest initiative in its move to strengthen its expertise in helping customers as they embark on their journey to implement SAP S/4HANA.
Posted April 27, 2016
Cisco is launching an appliance that includes the MapR Converged Data Platform for SAP HANA, making it easier and faster for users to take advantage of big data. The UCS Integrated Infrastructure for SAP HANA is designed to be easy to deploy, speed time to market, and reduce operational expenses, while giving users the flexibility to choose a scale-up (on-premises) or scale-out (cloud) storage strategy.
Posted April 27, 2016
OpenText, a provider of enterprise information management (EIM) solutions, is releasing an enhanced version of its namesake platform, addressing key areas of the user experience, machine-to-machine integration, automation, and more.
Posted April 27, 2016
Microsoft has been on a tear for the past couple of years. It has been pushing forward with a very steady stream of powerful new features and capabilities, even entire product lines, within its Data Platform business. But while Microsoft has been hard at work on this deluge of new technologies, it would be completely forgivable if you haven't noticed. The reason it's OK is that Microsoft is advancing on multiple fronts, both in the on-premises product line and even more dramatically with the Azure cloud-based products.
Posted April 27, 2016
Analytics software firm FICO has introduced its Decision Management Suite 2.0, an expanded, cloud-based suite of analytic tools and applications for prescriptive analytics and decision management. The new release is an update to the suite, which was first launched in 2013. The updated suite tackles a problem the company says is common to many big data deployments - the gap between collecting the vast amount of available data and translating it into useful insights that support business decisions.
Posted April 27, 2016
BackOffice Associates, a provider of information governance and data modernization solutions, has launched version 6.5 of its flagship Data Stewardship Platform (DSP). The company is also introducing an updated application for business process governance and application data management.
Posted April 26, 2016
The COLLABORATE 16 conference for Oracle users kicked off with a presentation by Unisphere Research analyst Joe McKendrick who shared insights from a ground-breaking study that examined future trends and technology among 690 members of three major Oracle users groups.
Posted April 25, 2016
The greatest power in using IoT-derived insights is the ability to respond to opportunities or threats immediately. However, enterprises largely have focused on historical reporting and will need to significantly modernize their analytics capabilities—both in understanding current events and predicting future outcomes—to take advantage of the new insights that IoT data can bring.
Posted April 25, 2016
The core reason for implementing in-memory technology is to improve performance. To help accelerate adoption of in-memory technologies and provide a universal standard for columnar in-memory processing and interchange, the lead developers of 13 major open source big data projects have joined forces to create Apache Arrow, a new top level project within the Apache Software Foundation (ASF).
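To make the idea of a shared columnar in-memory format concrete, the short sketch below uses the pyarrow bindings to build an Arrow table; the column names and values are invented for illustration and are not tied to any particular project participating in the Arrow effort.

```python
# Minimal sketch of Arrow's columnar in-memory layout via pyarrow;
# the column names and values are made up for illustration.
import pyarrow as pa

# Each column is a contiguous, typed Arrow array rather than a collection of row objects.
ids = pa.array([1, 2, 3, 4], type=pa.int64())
amounts = pa.array([9.99, 14.50, 3.25, 20.00], type=pa.float64())

table = pa.Table.from_arrays([ids, amounts], names=["order_id", "amount"])
print(table.schema)                        # language-neutral schema shared across systems
print(table.column("amount").to_pylist())  # columnar data accessed from Python
```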
Posted April 24, 2016
SAS has launched Viya, a new analytics and visualization architecture designed to help make analytics accessible to a greater swath of users. According to the vendor, Viya represents the foundation for a suite of offerings including machine learning, and will help speed the time between early-stage analytical exploration and resulting business value for users of all skill levels.
Posted April 22, 2016
MemSQL, provider of a distributed in-memory database that enables processing of transactions and running of analytics in real time, has closed its series C financing, raising $36 million. This latest MemSQL funding round brings the total amount raised to $85 million.
Posted April 21, 2016
Percona, a provider of MySQL and MongoDB solutions and services, is releasing an updated version of Percona Server for MongoDB.
Posted April 21, 2016