Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

Dell is updating its SharePlex database replication and near real-time data integration solution to enable users to replicate Oracle data directly to SAP HANA, Teradata, or EnterpriseDB Postgres.

Posted May 18, 2016

SnapLogic is unveiling new updates to its SnapLogic Elastic Integration Platform that add the ability to integrate streaming data and power big data analytics in the cloud. The Spring 2016 release adds support for Apache Kafka, Microsoft HDInsight, and Google Cloud Storage, plus multiple enhancements that automate data shaping and management tasks.

Posted May 18, 2016

Magnitude Software, a provider of enterprise information management (EIM) software, has acquired Datalytics Technologies LLC, a data warehouse solutions company. Founded in 1984, Datalytics Technologies offers pre-built data warehouse and data mart solutions for SAP's and Oracle's families of ERP systems.

Posted May 18, 2016

AtScale, Inc., which provides a self-service BI platform for Hadoop, has raised a Series B round of $11 million, bringing its total funding to date to $20 million. According to Bruno Aziza, chief marketing officer of AtScale, its platform is different from others in three key ways, making it applicable to use cases in an array of industries including healthcare, telecommunications, retail, and financial services.

Posted May 17, 2016

Bringing another tool to its product portfolio, IDERA has released SQL BI Check, a no-cost option for real-time performance monitoring of SQL Server Analysis Services (SSAS). SQL BI Check, with agentless installation, provides data on performance metrics with graphs and charts that assess SSAS health.

Posted May 17, 2016

EMC Corp.'s Enterprise Content Division (ECD) is releasing an upgraded version of its EMC InfoArchive platform, enhancing the ability to secure and leverage large amounts of critical data and content.

Posted May 09, 2016

The world's data has doubled in the past 18 months, and the industry estimates that the global amount of storage will reach 40 ZB by 2020. Historically, storage architectures were built on solutions that could only scale vertically. This legacy approach presents significant challenges to storing the tremendous quantities of data being created today in a way that is cost-effective and maintains high levels of performance. Most of the world's data centers are still using vertical scaling solutions for storage, so organizations are seeking alternatives that allow them to scale cheaply and efficiently in order to remain competitive. Now, with software-defined storage gaining traction, scale-out storage solutions are appearing in more data centers.

Posted May 04, 2016

The elastic and distributed technologies used to run modern applications require a new approach to operations, one that learns about your infrastructure and assists IT operators with maintenance and problem-solving. The interdependencies between new applications are creating chaos in existing systems and surfacing the operational challenges of modern systems. Solutions like microservices architectures alleviate the scalability pains of centralized proprietary services, but at a tremendous cost in complexity.

Posted May 04, 2016

The latest release of Oracle Database (12.1.0.2) offers a unique set of features that promise significant gains in application workload performance, especially for analytics and data warehousing queries. This release debuts Oracle Database In-Memory, which provides a new columnar format, the In-Memory Column Store (IMCS), for data that is likely to be accessed regularly for aggregation or analysis, as well as other features such as In-Memory Aggregation and In-Memory Joins that potentially offer several orders of magnitude of performance improvement. Finally, the new In-Memory Advisor makes short work of determining exactly which database objects are most likely able to take advantage of the IMCS.
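
As a rough illustration, here is a minimal sketch of what flagging a table for the IMCS can look like from Python. It assumes the cx_Oracle driver, a hypothetical SALES table, and an instance where the DBA has already sized the INMEMORY_SIZE parameter; it is a sketch under those assumptions, not a definitive recipe.

```python
# Hedged sketch: cx_Oracle driver, placeholder connection details and SALES table.
# The instance is assumed to have INMEMORY_SIZE configured so the IMCS is enabled.
import cx_Oracle

conn = cx_Oracle.connect("app_user", "app_password", "dbhost:1521/orclpdb")
cur = conn.cursor()

# Flag the table for population into the In-Memory Column Store (IMCS).
cur.execute("ALTER TABLE sales INMEMORY PRIORITY HIGH")

# Aggregation queries such as this become candidates for the columnar store.
cur.execute("SELECT prod_id, SUM(amount_sold) FROM sales GROUP BY prod_id")
for prod_id, total in cur:
    print(prod_id, total)

conn.close()
```

The In-Memory Advisor mentioned above is the tool intended to identify which objects are actually worth flagging this way.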

Posted May 04, 2016

Being able to assess the effectiveness and performance of your database systems and applications is one of the most important things that a DBA must be able to do. This can include online transaction response time evaluation, sizing of the batch window and determining whether it is sufficient for the workload, end-to-end response time management of distributed workload, and more. But in order to accurately gauge the effectiveness of your current environment and setup, service level agreements, or SLAs, are needed.

Posted May 04, 2016

What makes an IT organization static or dynamic? What triggers an organization to move from one to the other? The transformation is not easy and it certainly does not happen quickly. These questions can also be asked at a personal level. As an IT professional, are you more likely to be static or dynamic?

Posted May 04, 2016

IBM announced an expansion of its flash storage portfolio, with three new all-flash array products incorporating upgraded performance and a minimum latency of 250µs (microseconds).

Posted May 02, 2016

Coho Data, a provider of scale-out flash storage for the enterprise, announced support for OpenStack on DataStream and that the Coho Cinder driver is part of OpenStack Mitaka, which was released earlier this month.

Posted May 02, 2016

Dell Networking introduced new capabilities for campus and data center environments, including the launch of new cloud-managed wired and wireless solutions powered by Dell and Aerohive, Operating System 10 milestones and in-rack platforms for the data center.

Posted May 02, 2016

Magnitude Software, a provider of Enterprise Information Management (EIM) software, unveiled a new master data management offering designed to fuel business processes with accurate customer data for informed decision making.

Posted April 27, 2016

BackOffice Associates, a provider of information governance and data modernization solutions, is acquiring CompriseIT, a U.K. consulting firm specializing in helping enterprises adopt SAP Business Suite 4 SAP HANA (SAP S/4HANA). BackOffice Associates' acquisition of CompriseIT is the latest initiative in a move to strengthen its expertise in helping customers as they embark on their journey to implement SAP S/4HANA.

Posted April 27, 2016

Cisco is launching an appliance that includes the MapR Converged Data Platform for SAP HANA, making it easier and faster for users to take advantage of big data. The UCS Integrated Infrastructure for SAP HANA is easy to deploy, speeds time to market, and reduces operational expenses, while giving users the flexibility to choose a scale-up (on-premises) or scale-out (cloud) storage strategy.

Posted April 27, 2016

A new survey from Ponemon Institute put the average cost of an unplanned data center outage at $7,900 a minute, a 41% increase from 2010, when the cost per minute was $5,600. Typically, reported incidents lasted 86 minutes, totaling an average of $690,200 in costs, and it took 119 minutes to bring the data centers back up.

Posted April 27, 2016

Microsoft has been on a tear for the past couple of years. It has been pushing forward with a very steady stream of powerful new features and capabilities, even entire product lines, within its Data Platform business. But while Microsoft has been hard at work on this deluge of new technologies, it would be completely forgivable if you haven't noticed. The reason it's OK is that Microsoft is advancing on multiple fronts, both in the on-premises product line and even more dramatically with the Azure cloud-based products.

Posted April 27, 2016

Redis Labs, home of Redis, is releasing Redis on flash for standard x86 servers in the cloud and is partnering with Samsung to improve database performance. By running a combination of Redis on flash and DRAM, data center managers will benefit from leveraging the high throughput and low latency characteristics of Redis while achieving substantial cost savings, according to the company.
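
To make the client-side picture concrete, the DRAM/flash tiering described above is handled by the Redis deployment itself, so application code does not change. Below is a minimal sketch using the redis-py client; the host, port, and key names are placeholders rather than anything specific to Redis Labs' product.

```python
# Minimal sketch with the redis-py client; host, port, and key names are placeholders.
# Any DRAM/flash placement is a server-side concern and is invisible to this code.
import redis

r = redis.StrictRedis(host="redis.example.com", port=6379, db=0)

r.set("session:1001", "active")      # writes go through the normal Redis API
print(r.get("session:1001"))         # b'active'
print(r.incr("page_views:home"))     # counters and other data types work as usual
```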

Posted April 27, 2016

Cloudera, provider of a data management and analytics platform built on Apache Hadoop and open source technologies, has announced the general availability of Cloudera Enterprise 5.7. According to the vendor, the new release offers an average 3x improvement for data processing with added support of Hive-on-Spark, and an average 2x improvement for business intelligence analytics with updates to Apache Impala (incubating).
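Continue to the sketch below for a sense of what using Hive-on-Spark looks like: the execution engine is selected per session with a Hive setting. The example is a rough illustration using the PyHive client; the host, user, and table names are hypothetical, and the cluster-side configuration is assumed to already be in place.

```python
# Rough sketch: PyHive client against a HiveServer2 endpoint; host, user, and
# table names are placeholders. Switching the engine to "spark" enables Hive-on-Spark.
from pyhive import hive

conn = hive.connect(host="hiveserver2.example.com", port=10000, username="etl_user")
cur = conn.cursor()

cur.execute("SET hive.execution.engine=spark")
cur.execute("SELECT country, COUNT(*) FROM web_sessions GROUP BY country")
for country, sessions in cur.fetchall():
    print(country, sessions)

conn.close()
```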

Posted April 26, 2016

Zscaler, a provider of a security as a service platform, is unveiling a new service that enables organizations to provide access to internal applications and tools while ensuring the security of their networks.

Posted April 26, 2016

Neo Technology, creator of Neo4j, is releasing an improved version of its signature platform, enhancing its scalability, introducing new language drivers, and adding a host of other developer-friendly features.

Posted April 26, 2016

The greatest power in using IoT-derived insights is the ability to respond to opportunities or threats immediately. However, enterprises largely have focused on historical reporting and will need to significantly modernize their analytics capabilities, both in understanding current events and predicting future outcomes, to take advantage of the new insights that IoT data can bring.

Posted April 25, 2016

The need for data integration has never been more intense than it is today. The Internet of Things and its muscular sibling, the Industrial Internet of Things, are now being embraced as a way to better understand the status and working order of products, services, partners, and customers. Mobile technology is ubiquitous, pouring in a treasure trove of geolocation and usage data. Analytics has become the only way to compete, and with it comes a need for terabytes' and gigabytes' worth of data. The organization of 2016, in essence, has become a data machine, with an insatiable appetite for all the data that can be ingested.

Posted April 25, 2016

The core reason for implementing in-memory technology is to improve performance. To help accelerate adoption of in-memory technologies and provide a universal standard for columnar in-memory processing and interchange, the lead developers of 13 major open source big data projects have joined forces to create Apache Arrow, a new top level project within the Apache Software Foundation (ASF).
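
As a small illustration of what a columnar in-memory structure looks like in practice, here is a minimal sketch using pyarrow, Apache Arrow's Python binding; the column names and values are purely illustrative.

```python
# Minimal sketch with pyarrow (Apache Arrow's Python binding); data is illustrative.
import pyarrow as pa

# Each column is stored as a contiguous Arrow array in a language-neutral memory
# layout, so other Arrow-aware engines can share it without copying or re-serializing.
table = pa.Table.from_pydict({
    "user_id": [101, 102, 103],
    "score": [0.87, 0.92, 0.65],
})

print(table.schema)    # column names and types
print(table.num_rows)  # 3
```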

Posted April 24, 2016

GridGain Systems, provider of enterprise-grade in-memory data fabric solutions based on Apache Ignite, is releasing a new version of its platform. GridGain Professional Edition includes the latest version of Apache Ignite plus LGPL libraries, along with a subscription that includes monthly maintenance releases with bug fixes that have been contributed to the Apache Ignite project but will be included only with the next quarterly Ignite release.

Posted April 20, 2016

It seems every week there is another data breach in the news, which translates to millions and millions of personal records, credit card numbers, and other pieces of confidential information stolen each month. The victims of these breaches include important companies with professional IT staff. Now, you may be thinking: "Shouldn't the network guys be responsible for security?"

Posted April 20, 2016

To help organizations that are being held back from moving enterprise workloads to a public cloud because of business, legislative, or regulatory requirements that restrict where and how they handle data, Oracle has launched a new set of offerings. Introduced at Oracle CloudWorld in Washington, DC, by Thomas Kurian, president, Oracle, "Oracle Cloud at Customer" enables organizations to get the benefits of Oracle's cloud services but in their own data center.

Posted April 20, 2016

When users require access to multiple databases on multiple servers distributed across different physical locations, database security administration can become quite complicated. The commands must be repeated for each database, and there is no central repository for easily modifying and deleting user security settings on multiple databases simultaneously. At a high level, database security boils down to answering four questions.
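
To illustrate the repetition described above, the hedged sketch below loops over a hypothetical list of PostgreSQL servers with the psycopg2 driver and issues the same GRANT on each one; the servers, table, and role are placeholders, and other database platforms would need their own equivalent commands.

```python
# Hedged sketch: hypothetical PostgreSQL servers and the psycopg2 driver.
# With no central security repository, the identical command is repeated per database.
import psycopg2

SERVERS = [
    {"host": "db-nyc.example.com", "dbname": "orders"},
    {"host": "db-lon.example.com", "dbname": "orders"},
    {"host": "db-sgp.example.com", "dbname": "orders"},
]

for server in SERVERS:
    conn = psycopg2.connect(user="sec_admin", password="secret", **server)
    conn.autocommit = True
    with conn.cursor() as cur:
        # The same security change, issued once per server and database.
        cur.execute("GRANT SELECT ON TABLE customer_orders TO analyst_role")
    conn.close()
```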

Posted April 19, 2016

Compuware has added richer visualization to ISPW, its mainframe source code management and release automation solution, and to Topaz, its mainframe developer solution. "As an ever-growing number of non-mainframe applications make an ever-growing number of calls to the mainframe, the agility of large enterprises increasingly depends on their ability to quickly, safely, and efficiently modify mainframe code," said Compuware CEO Chris O'Malley.

Posted April 18, 2016

The OpenPOWER Foundation has introduced more than 50 new infrastructure and software innovations spanning the entire system stack, including systems, boards, cards, and accelerators. Building upon the 30 OpenPOWER-based solutions already in the marketplace, the new offerings include more than 10 new OpenPOWER servers aimed at high-performance computing, cloud deployments, and server virtualization.

Posted April 18, 2016

Serena Software, a provider of application development and release management solutions, is shipping a new version of its change, configuration, and release management solution for mainframes running z/OS. Version 8.1.1 of ChangeMan ZMF includes new capabilities to enable mainframe development.

Posted April 18, 2016

Sumo Logic, a provider of cloud-native machine data analytics services, is unveiling a new platform that natively ingests, indexes, and analyzes structured metrics data and unstructured log data together in real time.

Posted April 18, 2016

Teradata, the big data analytics and marketing applications company, is making key investments in the Internet of Things (IoT) and the Analytics of Things (AoT), along with updating its signature platforms.

Posted April 18, 2016

Hortonworks is making several key updates to its platform, along with furthering its mission of being a leading innovator in open and connected data solutions by enhancing partnerships with Pivotal and expanding upon established integrations with Syncsort.

Posted April 15, 2016

IDERA is releasing its SQL Inventory Check platform for free to allow database administrators (DBAs) to easily discover servers on the network and verify versions to keep them properly maintained or to prepare for migrations.

Posted April 13, 2016

SnapLogic is releasing its hybrid execution framework Snaplex on the Microsoft Azure Marketplace as Azureplex, giving users the ability to gain business insights faster with self-service data integration from a plethora of sources.

Posted April 07, 2016

Segment, which provides a customer data hub, has introduced a new platform that will give companies access to new types of cloud services data. The new platform, dubbed "Sources," allows users to now export, sync, and store data from CRM, marketing, and payments platforms and move it into Postgres or Amazon Redshift for analysis.

Posted April 07, 2016

Delphix is making major updates to its data operations platform, delivering enhancements to strengthen secure application development in the data center and in the cloud. "One of the main bottlenecks that we hear about over and over is management of all of this data," said Dan Graves, vice president of product marketing. "That's where Delphix comes in. Our core value is unlocking that data in a secure way to allow businesses to have a fast, fresh, full production environment."

Posted April 06, 2016

Micro Focus, an enterprise infrastructure solutions provider, plans to acquire Serena Software, Inc., a provider of application lifecycle management software, in a transaction valued at $540 million. The acquisition is expected to close in early May 2016, subject to receipt of competition clearances in the U.S. and Germany.

Posted April 04, 2016

Dynatrace, a performance tools vendor, has formed a global partnership with HCL Technologies, an IT services company. HCL will leverage Dynatrace's digital performance management technologies within HCL's DryICE platform to deliver user experience and application monitoring to HCL customers.

Posted April 04, 2016

AppFormix has integrated Intel Resource Director Technology (Intel RDT) into its performance monitoring platform, delivering performance improvements that address the "noisy neighbor" problem common to multi-tenant cloud environments. Intel RDT technology is available in the recently announced Intel Xeon processor E5-2600 v4 product family.

Posted April 04, 2016

Impelsys, provider of publishing and learning technology solutions, and Ontotext, a provider of semantic technology, are releasing an integrated technology offering.

Posted April 04, 2016

IBM says it is making it easier and faster for organizations to access and analyze data in-place on the IBM z Systems mainframe with a new z/OS Platform for Apache Spark. The platform enables Spark to run natively on the z/OS mainframe operating system.

Posted April 04, 2016

The emergence of big data, characterized in terms of its four V's—volume, variety, velocity, and veracity—has created both opportunities and challenges for credit scoring.

Posted April 04, 2016
