Big Data Articles
We are now seeing a seismic shift in the significance of social network data for marketing and brand analysis. The next wave of social network exploitation promises to allow companies to narrowly target consumers and leads, to predict market trends, and to more actively influence consumer behavior.
Posted November 19, 2014
Splice Machine today announced the general availability of its Hadoop RDBMS, a platform for building real-time, scalable applications that incorporates new features that emerged from charter customers using the beta offering. With the additional new features and the validation from beta customers, Splice Machine 1.0 can support enterprises struggling with their existing databases and seeking to scale out affordably, said Monte Zweben, co-founder and CEO, Splice Machine.
Posted November 19, 2014
Enterprise NoSQL database platform provider MarkLogic today announced the availability of MarkLogic 8 Early Access Edition, which brings together advanced search, semantics, bitemporal capabilities, and native JavaScript support into one platform. The overall theme of MarkLogic 8 is ease of use. "This release is all about bringing MarkLogic to where the developers are and making it easy for them to adopt MarkLogic," said Joe Pasqua, senior vice president, product strategy, MarkLogic.
Posted November 18, 2014
Informatica has announced an expanded partnership with Amazon Web Services (AWS). With this announcement, Informatica Cloud is moving to AWS as its underlying infrastructure.
Posted November 18, 2014
Concurrent has announced the latest version of Driven, a big data application performance-monitoring and management system. Driven is purpose-built to address the challenges of enterprise application development and deployment for business-critical data applications, delivering control and performance management for enterprises seeking to achieve operational excellence.
Posted November 18, 2014
Seagate Technology, a provider of storage solutions, has introduced the ClusterStor Hadoop Workflow Accelerator. The solution is expected to be a boon to computationally intensive, high-performance data analytics environments, enabling them to achieve a significant reduction in data transfer time.
Posted November 17, 2014
The U.S. Department of Energy has awarded IBM contracts valued at $325 million to develop and deliver advanced "data-centric" supercomputing systems at Lawrence Livermore and Oak Ridge National Laboratories. "This architecture is really part of a paradigm that addresses the big data challenge, one we hear about here at IBM all the time - which we call data-centric computing. We believe the value of a supercomputer is not only tied to petaflops but also to the speed of insights. We solve this particular challenge working with the labs through an open ecosystem leveraging technologies with our partners at the OpenPower Foundation, NVIDIA and Mellanox," said Tom Rosamilia, senior vice president, IBM Systems and Technology Group, during a webcast to announce the new supercomputing systems.
Posted November 14, 2014
The holiday season is right around the corner, a time for cheer and goodwill towards men. That got me thinking about the whole "most wonderful time of the year" tune playing in the background and how that has some special implications for the SQL Server world. Here's a bit of context: I'm writing this article for you in the midst of the biggest gathering of SQL Server professionals in any given year, the PASS Summit. One of the most visible sights when attendees get together for the very first time at the registration desk or in the assembly hall for the first keynote address is the huge number of hugs, backslaps, fist bumps, high fives, and a variety of other happy and genuine reunions.
Posted November 12, 2014
Big data tool vendors try to downplay the notion that data warehouses and data marts still need to exist, even in a big data world. Relational DBMSs are painted as "old-fashioned," "yesterday," and "inadequate." They beckon potential customers to take a dip in the refreshing data lake. The fact that big data, in all of its glory, is only part of a larger business intelligence solution is getting lost in the dialog.
Posted November 12, 2014
Talend has introduced a new release of its integration platform. The 5.6 release sets new benchmarks for big data productivity and profiling, innovates in MDM with efficiency controls, and broadens Internet of Things (IoT) device connectivity.
Posted November 12, 2014
Services provider Infosys has formed a strategic partnership with BI and analytics expert Tableau. Infosys will integrate Tableau's software into the solutions it deploys to help clients gain more value from big data. Infosys says it will use its global training facilities to increase the number of Tableau analytics experts in the company, ensuring that the benefits of business analytics are included within a wide range of client solutions across multiple industries.
Posted November 11, 2014
Attunity Ltd. has expanded its Attunity CloudBeam solution to support Amazon Web Services customers in moving data from Amazon Relational Database Service (Amazon RDS) to Amazon Redshift. Attunity has also been awarded Big Data Competency status in the AWS Partner Network (APN) Competency Program.
Posted November 11, 2014
Big data and analytics workloads are placing greater demands on the enterprise, and creating the need for software-defined infrastructure, says Bernie Spang, VP, Strategy, Software-Defined Environments, IBM Systems & Technology Group. With the volume of data organizations are dealing with today, there is a need to optimize compute and storage resources. "We can't do it with the traditional, manual, rigid IT environment of the past," said Spang.
Posted November 10, 2014
Cloudant, an IBM company, has introduced an on-premises version of Cloudant software that companies can install in their own data centers to run their own DBaaS. According to the company, the addition of Cloudant Local (IBM Cloudant Data Layer Local Edition) provides customers with a strategy for managing application data across any mix of infrastructure and deployment approaches.
Posted November 10, 2014
Rocket Software has announced Rocket Data Virtualization version 2.1, a mainframe data virtualization solution for universal access to data, regardless of location, interface or format.
Posted November 10, 2014
At the most fundamental level, consider that at the end of the day NoSQL and SQL are essentially performing the same core task — storing data to a storage medium and providing a safe and efficient way to retrieve it later. Sounds pretty simple — right? Well, it really is, with a little planning and research. Here's a simple checklist of 5 steps to consider as you venture into the world of NoSQL databases.
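To make that shared core task concrete, here is a minimal sketch in Python using only the standard library: sqlite3 stands in for the SQL side, and a plain dict stands in for a key-value NoSQL store. The table and key names are purely illustrative and are not tied to any particular product.

```python
# Conceptual sketch: SQL and NoSQL-style stores both boil down to
# "write a record now, read it back safely later."
import sqlite3

# SQL side: store and retrieve a user record against a declared schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (id, name) VALUES (?, ?)", (1, "Ada"))
row = conn.execute("SELECT name FROM users WHERE id = ?", (1,)).fetchone()
print(row[0])  # -> Ada

# NoSQL-style key-value side: the same task, with no schema declared up front.
kv_store = {}                         # a dict standing in for a key-value database
kv_store["user:1"] = {"name": "Ada"}
print(kv_store["user:1"]["name"])     # -> Ada
```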
Posted November 05, 2014
Dell Software is collaborating with Microsoft to provide predictive analytics in a hybrid cloud setting and also upgrading its Statistica (formerly StatSoft) advanced analytics platform with enhanced big data capabilities through integration with Kitenga.
Posted November 05, 2014
NuoDB has introduced Swifts Release 2.1, which includes the first phase of its HTAP (Hybrid Transaction/Analytical Processing) capabilities. HTAP aims to provide real-time operational intelligence, with the goal of allowing businesses to acquire immediate insights that they can then use to optimize their business processes.
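NuoDB's own implementation is not shown here, but the general HTAP idea can be sketched with nothing more than Python's built-in sqlite3: a single live database accepts transactional writes and immediately answers an analytical aggregate query over the same data, with no separate export or warehouse step in between. This is a hypothetical illustration of the pattern, not NuoDB's API.

```python
# Hypothetical HTAP-style sketch: one live store serves both the
# transactional (OLTP) write path and an analytical (OLAP) query.
# SQLite is used only to keep the example self-contained.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# Transactional side: record orders as they arrive.
with db:  # wraps the inserts in a single committed transaction
    db.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 200.0)],
    )

# Analytical side: an immediate aggregate over the same live data.
for region, total in db.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
):
    print(region, total)  # APAC 200.0, then EMEA 200.0
```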
Posted November 04, 2014
The Oracle Applications Users Group (OAUG), in partnership with the OAUG Hyperion SIG, is hosting its Connection Point - Hyperion Online educational series Nov. 17-19. The OAUG Connection Point - Hyperion Online features 18 training sessions on topics including budgeting and forecasting; financial management; data management; Oracle Hyperion Tax Provision; product improvements in Oracle Hyperion 11.1.2.3 and 11.1.2.3.500; as well as cloud, SaaS, on-premises and other deployment options.
Posted November 03, 2014
Azul Systems, a provider of Java runtime solutions, and DataStax, which provides DataStax Enterprise (DSE) built on Apache Cassandra, have formed a partnership to allow DSE customers to leverage Azul Zing. According to the companies, Zing is the best JVM for real-time Cassandra deployments, allowing Cassandra to operate more consistently by eliminating JVM-caused response time delays.
Posted November 03, 2014
The GridGain In-Memory Data Fabric has been accepted into the Apache Incubator program under the name "Apache Ignite." GridGain will continue to be a contributor to the Ignite code base while also adding enterprise-grade features to its commercial product. The platform's core code will be managed by the non-profit Apache Software Foundation (ASF).
Posted November 03, 2014
Percona, which provides enterprise-grade MySQL support, consulting, training, managed services, and server development services, will be hosting Percona Live 2014 in London from November 3 to 4.
Posted October 29, 2014
Twitter and IBM have formed a new partnership to help improve organizations' understanding of their customers, markets and trends. The alliance brings together Twitter data with IBM's cloud-based analytics, customer engagement platforms, and consulting services. IBM says the collaboration will focus on three key areas.
Posted October 29, 2014
VMware has acquired the assets of Continuent. The Continuent team is joining VMware's Hybrid Cloud Business Unit. The acquisition offers "concrete benefits" to Continuent customers, said Robert Hodges, CEO of Continuent.
Posted October 29, 2014
Within many companies' marketing departments there is a greater emphasis than ever before on using big data to make their products more appealing to customers. A major use for the data is not only to provide the best possible experience for the consumer, but to provide it efficiently. Teradata's enhancements to the Teradata Integrated Marketing Cloud are aimed at improving digital asset management and performance, real-time interaction management, and use of data in real time.
Posted October 28, 2014
Platfora, which provides a big data analytics platform built natively on Hadoop and Spark, has introduced Platfora 4.0 with advanced visualizations, geo-analytics capabilities, and collaboration features to enable users with a range of skill levels to work iteratively with data at scale.
Posted October 28, 2014
Protegrity, a provider of data security solutions, has announced an expanded partnership with Hadoop platform provider Hortonworks. Protegrity Avatar for Hortonworks extends the capabilities of HDP native security with Protegrity Vaultless Tokenization (PVT) for Apache Hadoop, Extended HDFS Encryption, and the Protegrity Enterprise Security Administrator, for advanced data protection policy, key management and auditing.
Posted October 28, 2014
Big data continues to grow at an exponential rate for many enterprises. One issue that continues to grow as well is the threat to data security.
Posted October 28, 2014
At SAP TechEd & d-code, SAP announced new innovations for the latest release of SAP HANA, the fall update of SAP HANA Cloud Platform, and a new SAP API Management technology.
Posted October 22, 2014
SAP SE has announced the SAP Cloud for Planning solution, an enterprise performance management (EPM) solution designed around user experience and built for the cloud. The SAP Cloud for Planning solution will be built natively on SAP HANA Cloud Platform, the in-memory platform-as-a-service (PaaS) from SAP.
Posted October 22, 2014
SAP and BI provider Birst have formed a partnership to provide analytics in the cloud on the SAP HANA Cloud Platform. This collaboration intends to bring together the next-generation cloud platform from SAP with Birst's two-tier data architecture to provide instant access to an organization's data and help eliminate BI wait time.
Posted October 22, 2014
Oracle Platinum Partner Data Intensity, a provider of Oracle-focused application management and cloud services, has acquired business analytics and database management specialist CLEAR MEASURES. According to Data Intensity, the acquisition will enable it to extend its database technology coverage and remote managed services, as well as enter the analytics and business intelligence services market with proven solutions that are already used in more than 200 customer implementations.
Posted October 22, 2014
Oracle has expanded its data integration portfolio with the addition of Oracle Enterprise Metadata Management, a platform to help organizations govern data across the enterprise including structured and unstructured data, and across Oracle and third-party data integration, database, and business analytics platforms. "This is the first time that we have made a comprehensive offering in the area of metadata management," said Jeff Pollock, vice president of product management for Oracle Data Integration.
Posted October 22, 2014
Apache Hadoop has been a great technology for storing large amounts of unstructured data, but to do analysis, users still need to reference data from existing RDBMS-based systems. This topic was addressed in "From Oracle to Hadoop: Unlocking Hadoop for Your RDBMS with Apache Sqoop and Other Tools," a session at the Strata + Hadoop World conference, presented by Guy Harrison, executive director of Research and Development at Dell Software, David Robson, principal technologist at Dell Software, and Kathleen Ting, a technical account manager at Cloudera and a co-author of O'Reilly's Apache Sqoop Cookbook.
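As a rough illustration of what such an import looks like in practice, the sketch below launches a Sqoop import job from Python. The connection string, table name, and paths are hypothetical placeholders; it assumes a Sqoop installation on the PATH and the appropriate JDBC driver, and is not taken from the session itself.

```python
# Hypothetical sketch: pull a table from an Oracle RDBMS into HDFS with Sqoop.
# All connection details, table names, and paths below are placeholders.
import subprocess

subprocess.run(
    [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL",
        "--username", "scott",
        "--password-file", "/user/scott/.sqoop-password",  # keeps the password off the command line
        "--table", "EMPLOYEES",
        "--target-dir", "/data/employees",
        "--num-mappers", "4",  # parallel map tasks for the import
    ],
    check=True,  # raise if the Sqoop job exits with a non-zero status
)
```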
Posted October 22, 2014
In his presentation at the Strata + Hadoop World conference, titled "Unseating the Giants: How Big Data is Causing Big Problems for Traditional RDBMSs," Monte Zweben, CEO and co-founder of Splice Machine, addressed the topic of scale-up architectures as exemplified by traditional RDBMS technologies versus scale-out architectures, exemplified by SQL on Hadoop, NoSQL and NewSQL solutions.
Posted October 22, 2014
To help simplify the process for users of self-service BI tools, Logi Analytics has announced the latest version of its business intelligence platform, Logi Info. "Self-service has been around for a while, but it never seems to deliver on its promise. Largely, that is because we are mismatching people and their capabilities with the tool sets and information they need," explained Brian Brinkmann, VP of Product for Logi Analytics.
Posted October 21, 2014
MapR Technologies, a provider of one of the top-ranked Hadoop distributions, has announced that MapR-DB is now available for unlimited production use in the freely downloadable MapR Community Edition. "From a developer standpoint, they can combine the best of Hadoop, which is deep predictive analytics across the data, as well as a NoSQL database for real-time operations," explained Jack Norris, chief marketing officer for MapR Technologies.
Posted October 21, 2014
Companies are facing the "big squeeze" created by IT budgets that are relatively flat, growing by only 3% to 4% a year, versus data growth that is averaging 30% to 40%, and a consensus that data is a valuable commodity that cannot be thrown away, said Monte Zweben, CEO and co-founder of Splice Machine in his presentation at the Strata + Hadoop World conference.
Posted October 21, 2014
Datameer has introduced Datameer 5.0 with Smart Execution, a technology that examines dataset characteristics, analytics tasks and available system resources to determine the most appropriate execution framework for each workload.
Posted October 21, 2014
At Strata + Hadoop World in New York, Microsoft announced an update to Microsoft Azure HDInsight, its cloud-based distribution of Hadoop. Customers can now process millions of Hadoop events in near real time, with Microsoft's preview of support for Apache Storm clusters in Azure HDInsight. In addition, as part of its integration with the Azure platform, Hortonworks announced that the Hortonworks Data Platform (HDP) has achieved Azure Certification.
Posted October 20, 2014