Five Minute Briefing - Data Center
June 10, 2019
Published in conjunction with SHARE Inc., Five Minute Briefing - Data Center is a bi-weekly report geared to the needs of data center professionals.
News Flashes
Cockroach Labs, the company behind CockroachDB, is partnering with AWS, Google Cloud Platform, Microsoft Azure, IBM Cloud, Rackspace, Oracle Cloud, Digital Ocean, and OpenShift to enable customers to deploy hybrid/multi-cloud databases that take advantage of the industry-leading advances in consistency, scalability, resilience, and data locality delivered by CockroachDB.
Db2 version 11.5 adds drivers for leading AI languages to ease AI application development with Db2; new natural language querying and visualization capabilities designed to speed data search; and an updated common SQL engine to make Db2 accessible to more data sources.
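As a rough illustration of the first of those items, here is a minimal sketch using ibm_db, IBM's Python driver for Db2; the hostname, credentials, and table name are placeholders invented for this example, not details from the announcement.

```python
import ibm_db  # IBM's Python driver for Db2

# Hypothetical connection string -- host, database, and credentials
# are placeholders, not values from the announcement.
conn_str = (
    "DATABASE=SAMPLE;"
    "HOSTNAME=db2.example.com;"
    "PORT=50000;"
    "PROTOCOL=TCPIP;"
    "UID=dbuser;"
    "PWD=secret;"
)
conn = ibm_db.connect(conn_str, "", "")

# Pull a result set that could feed, say, a model-training pipeline.
stmt = ibm_db.exec_immediate(
    conn, "SELECT id, feature_a, feature_b FROM training_data"
)
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row)
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```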
The Open Mainframe Project (OMP), an open source initiative that enables collaboration across the mainframe community to develop shared tool sets and resources, announced the launch of this year's internship program with nine students from around the world. Each intern will be paired with mentors from member organizations such as Red Hat, IBM, Sine Nomine Associates, and SUSE, each of whom has designed a project to address a specific mainframe development or research challenge.
Red Hat Enterprise Linux 7.7 beta is now available. The beta release was announced in a Red Hat blog post by Chris Baker. The latest update to the stable and secure Red Hat Enterprise Linux 7 platform marks the final release in the Full Support Phase (formerly known as "Production Phase 1") of the RHEL 7 lifecycle, as described in the Red Hat Enterprise Linux Lifecycle.
News From SHARE
Across the United States, colleges and schools are renewing efforts to encourage women of all ages to enter science, technology, engineering, and math (STEM) fields. As companies look to fill their technology-based job openings, many have embarked on diversity initiatives to expand their employee base and talent pipelines.
Think About It
Change data capture (CDC) can greatly reduce the amount of data processed, but the cost is that the capture processes themselves become more complicated and overall storage may grow. The costs are shifted rather than eliminated: the final level of processing focuses only on the minimal set of changed records, and that minimization is where the efficiency is gained. From there, consuming the data becomes standardized and ultimately straightforward.
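To make the trade-off concrete, here is a minimal sketch; the table snapshot, change feed, and apply_changes helper are all hypothetical, and a real CDC tool would read changes from a database log rather than an in-memory list.

```python
# Full snapshot of a table: without CDC, every row is reprocessed each run.
snapshot = {
    1: {"name": "alice", "balance": 100},
    2: {"name": "bob", "balance": 250},
}

# With CDC, downstream consumers see only what changed since the last run.
change_feed = [
    {"op": "update", "key": 2, "row": {"name": "bob", "balance": 300}},
    {"op": "insert", "key": 3, "row": {"name": "carol", "balance": 50}},
    {"op": "delete", "key": 1, "row": None},
]

def apply_changes(table, changes):
    """Apply a batch of CDC events to a local copy of the table."""
    for event in changes:
        if event["op"] == "delete":
            table.pop(event["key"], None)
        else:  # insert and update both overwrite the keyed row
            table[event["key"]] = event["row"]
    return table

# Three events are processed instead of the whole table -- the efficiency
# described above. The cost: the events themselves must be captured,
# stored, and applied in order, which is the added complexity.
print(apply_changes(dict(snapshot), change_feed))
```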