Business Intelligence and Analytics

The world of Business Intelligence and Analytics is evolving quickly. Increasingly, the emphasis is on real-time Business Intelligence to enable faster decision making, and on Data Visualization, which enables data patterns to be seen more clearly. Key technologies involved in preparing raw data to be used for Business Intelligence, Reporting and Analytics – including ETL (Extract, Transform, and Load), CDC (Change Data Capture), and Data Deduplication – support a wide range of goals within organizations.
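For readers new to these terms, the following is a minimal sketch of a single extract-transform-load step in Python; the file names, column names, and cleanup rules are hypothetical and stand in for whatever a real pipeline would use.

import csv

def etl(source_path, target_path):
    # Extract: read raw rows from a hypothetical CSV export
    with open(source_path, newline="") as src:
        rows = list(csv.DictReader(src))

    # Transform: normalize a hypothetical "email" column and drop duplicates
    seen = set()
    cleaned = []
    for row in rows:
        email = row.get("email", "").strip().lower()
        if email and email not in seen:  # simple deduplication
            seen.add(email)
            row["email"] = email
            cleaned.append(row)

    # Load: write the cleaned rows to the file used for reporting
    if cleaned:
        with open(target_path, "w", newline="") as tgt:
            writer = csv.DictWriter(tgt, fieldnames=cleaned[0].keys())
            writer.writeheader()
            writer.writerows(cleaned)

etl("raw_customers.csv", "clean_customers.csv")

Real ETL and CDC tooling adds scheduling, incremental capture of changed records, and error handling on top of this basic pattern.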



Business Intelligence and Analytics Articles

Arcadia Data, a provider of real-time modern business intelligence (BI) platforms, is releasing Arcadia Enterprise 3.3, which adds significant upgrades for uncovering real-time insights. The new features - including Arcadia Smart Acceleration, advanced segmentation and cohort analytics, analytic extensions, and mobile and tablet support - unify real-time insights from streaming data with historical data discovery in a single view.

Posted December 20, 2016

Progress has announced the release of Progress DataDirect for Apache Cassandra, a series of connectors in its DataDirect enterprise data connectivity suite.

Posted December 20, 2016

Oracle's total quarterly cloud revenue was $1.1 billion, marking the first quarter in which it surpassed the $1-billion mark.

Posted December 19, 2016

MariaDB Corporation is launching MariaDB ColumnStore 1.0, an open source columnar storage engine that unites transaction and analytic processing to deliver seamless big data analytics. New features and benefits of ColumnStore 1.0 include lower cost of ownership and better price performance, easier enterprise analytics, and faster, more efficient queries.

Posted December 16, 2016

Pentaho, a Hitachi Group Company, has announced that Pentaho Labs has created an integration with MQTT, a machine-to-machine (M2M) IoT transport protocol, to act as the connecting link between physical devices and the data integration process. MQTT, or MQ Telemetry Transport, was created by IBM in 1999 and is used today in a variety of IoT environments, including smart homes, manufacturing, and energy.
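As a rough illustration of how an MQTT subscriber feeds device readings into a downstream integration step, here is a generic sketch using the paho-mqtt client; this is not Pentaho's actual integration, and the broker address and topic filter are hypothetical.

import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # Each message carries a device reading; hand it off to the pipeline
    print("received", msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)   # hypothetical broker
client.subscribe("factory/+/temperature")    # hypothetical topic filter
client.loop_forever()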

Posted December 16, 2016

Infosys, a provider of consulting, technology, outsourcing and next-generation services, and NoSQL database technology provider MongoDB have formed a global partnership designed to help enterprises accelerate their digital transformation and application modernization initiatives.

Posted December 16, 2016

Rocana is releasing Rocana Ops 2.0, updating its event alerting and orchestration capabilities, offering a unique visual experience for first responders, and providing wider cloud platform visibility. Now with Rocana Ops 2.0, IT teams can take real-time, intelligent action on all of their event data. Rocana Reflex is a new event alerting and orchestration system that enables operations teams to provide smart, instant, and automated reactions based on what is happening in their environment.

Posted December 16, 2016

Databricks is receiving $60 million in a Series C funding round led by New Enterprise Associates (NEA), boosting its commitment to Apache Spark and the Databricks data platform. The company will continue to expand its data platform to make data science and engineering at scale with Spark even better and to integrate with other cloud providers to provide more flexibility for its fast-growing global customer base.

Posted December 16, 2016

Crate.io is releasing CrateDB 1.0, an open source SQL database that enables real-time analytics for machine data applications. CrateDB makes machine data applications that were previously only possible using NoSQL solutions available to mainstream SQL developers.

Posted December 15, 2016

By now it's well documented that employees will take IT tasks into their own hands when IT cannot give them the support they need, and when they need it. The phenomenon is ubiquitous and persistent, even to the point that it's been dubbed "shadow IT." Gartner recently noted that by 2020, a third of successful attacks experienced by enterprises will be on their shadow IT resources.

Posted December 14, 2016

Informatica, a provider of data management solutions, is now offering hourly pricing for Informatica Cloud Services for Microsoft Azure in the Azure Marketplace. Available as a pay-as-you-go hourly pricing model, the solution is designed to help users of the Azure cloud platform and Microsoft Cortana Intelligence Suite get started faster on cloud data integration and management projects. Additionally, users can run enterprise-class integration jobs with no upfront costs, and simplify integration of disparate data sources - on-premises, in the cloud or in a hybrid environment.

Posted December 14, 2016

As 2016 comes to a close, experts at SAP are looking ahead at what 2017 may offer. Ken Tsai, vice president, head of cloud platform and data management, product marketing, at SAP, and David Jonker, senior director, product marketing, at SAP, foresee several trends on the horizon that the company will tackle moving forward.

Posted December 14, 2016

SAP reached more than 4,100 SAP S/4HANA customers globally in the third quarter. The company saw growth from the sales of SAP Hybris and SAP SuccessFactors solutions in SAP's small and medium business (SMB) segment, which is now up to 6,000 customers, including 1,350 that chose SAP SuccessFactors Employee Central, according to SAP.

Posted December 14, 2016

Onapsis, a provider of business-critical application security, has announced certification by SAP for the Onapsis Security Platform 2.0 (OSP). The certification includes SAP S/4HANA and enables an integration with OSP that gives SAP customers visibility into current, detailed information about the vulnerability and audit levels of their S/4HANA system alongside that of their entire SAP environment.

Posted December 14, 2016

IDERA, a provider of database lifecycle management solutions, has released Workload Analysis for SAP HANA, which is aimed at enabling a smooth transition to the in-memory computing platform as an alternative to traditional relational databases.

Posted December 14, 2016

IT operations teams have been furiously preparing for this year's holiday season - the retail industry's busiest time of the year for web traffic. Here is a checklist of the key factors IT operations teams need to consider to ensure their IT infrastructure is ready now - and stays that way throughout the shopping season.

Posted December 14, 2016

ZeroPoint technology focuses on analyzing documents, email, web content, and server traffic for hazardous content such as malicious code.

Posted December 13, 2016

IBM has announced new Bluemix services designed to simplify and speed app development in the cloud. With these new services, developers can now access and construct preconfigured toolchains using popular DevOps tools, including GitHub and Slack.

Posted December 12, 2016

CA Technologies has signed a definitive agreement to acquire Automic Holding GmbH, a provider of software for automating IT and business processes.

Posted December 12, 2016

When software providers consider transitioning to (or at the very least adding) a SaaS offering, they think about the impact to their business of moving from a perpetual license model to a recurring revenue stream. And while it's easy to remember and consider such migration costs as application-level rearchitecture, other upfront and ongoing costs - such as infrastructure and service-related costs - are often severely underestimated.

Posted December 12, 2016

BlueData is announcing the general availability of its BlueData EPIC software on Amazon Web Services (AWS). BlueData EPIC provides the same self-service user experience whether running in the public cloud or a corporate data center. The platform also ensures that the same Docker-based application images for Hadoop, Spark, and other big data tools are portable across on-premises and cloud deployments.

Posted December 09, 2016

Scribe Software is releasing the next generation of Scribe Online, offering a completely revamped user experience and an extended application programming interface. The new update simplifies and accelerates the process of creating and maintaining integrations between cloud services, on-premises applications, and databases.

Posted December 08, 2016

It has become all too clear that no organization is immune from the risk of a data breach, and that anyone accessing data can pose a threat - including trusted employees and partners. Here, IT executives speculate on the impact of newer technologies such as IoT, blockchain, and cloud, as well as the need for data protection measures, including disaster recovery plans, encryption, and comprehensive oversight.

Posted December 07, 2016

Oracle has signed an agreement to acquire Dyn, a cloud-based internet performance and domain name system (DNS) provider that monitors, controls, and optimizes internet applications and cloud services to enable faster access, reduced page load times, and higher end user satisfaction.

Posted December 07, 2016

Oracle Linux 7 Update 3 for x86-64 servers is now generally available, according to Teri Whitaker, program manager at Oracle, who announced the release in a post on Oracle's Linux blog.

Posted December 07, 2016

SnapLogic is receiving $40 million in Series F funding to accelerate its global expansion and transform the way organizations integrate data, applications, and devices for digital business. The new round was led by European private equity firm Vitruvian Partners, with further investment from Andreessen Horowitz, Capital One, Ignition Partners, NextEquity Partners, and Triangle Peak Partners. This brings SnapLogic's funding to $136.3 million to date.

Posted December 07, 2016

Dell Boomi is releasing an enhanced version of its integration Platform as a Service (iPaaS) to help organizations rapidly and cost-effectively integrate data across applications in hybrid IT environments. The latest release enables Boomi customers to build a connected business at scale through new features to increase user productivity, speed up implementation, and provide unparalleled control of integrations.

Posted December 07, 2016

MapR Technologies, Inc. is updating its MapR Ecosystem Pack program to include enhancements that will add flexible access and provide new capabilities for streaming applications. The MapR Ecosystem Pack is a broad set of open source ecosystem projects that enable big data applications running on the MapR Converged Data Platform.

Posted December 02, 2016

In what has become a data-driven world, your organization's data is valuable. It has become the "keys to the kingdom," so to speak. Very few companies today could function without data, especially good data. However, I would suggest that more important than data is information. Data provides the building blocks, but information is really the consumable outcome that can be used as a competitive edge.

Posted December 01, 2016

Terms such as "active," "inactive," and "canceled" may seem mundane and inconsequential, and when folks hear the term "valid values," their eyes glaze over and expectations of interest diminish. But exciting or not, reference values and an understanding of them are important to every organization.

Posted December 01, 2016

If you are a SQL Server professional, but you don't know about the PASS Summit, then you are missing out. The annual conference is convened every fall in downtown Seattle, the backyard of Microsoft, and attracted over 6,000 attendees this year. And, since it's so close to the Microsoft Redmond campus, hundreds of the SQL Server developers and program managers get to attend—answering user questions, delivering sessions, and presenting chalk talks and panel discussions.

Posted December 01, 2016

The clear trend these days is to automate and enable computerized tasks to streamline and optimize administrative and maintenance tasks. Many database management tasks that today require oversight and handholding by DBAs can, over time, be turned over to intelligently automated software to manage. But automation is just the first step.

Posted December 01, 2016

It's been amusing to watch the NoSQL movement transition from a "We don't need no stinking SQL" attitude to a "Can I please have some SQL with that?" philosophy. The nonrelational databases that emerged over the past 8 years initially offered no SQL capabilities. However, today we have an embarrassment of SQL options for "NoSQL." Hive offers SQL for Hadoop systems, Spark has SparkSQL, MongoDB has a SQL-based BI connector, and so on.
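As an illustration of the "SQL on NoSQL" trend, Spark SQL lets a developer run plain SQL over semi-structured data; the following is a minimal PySpark sketch in which the JSON file and column names are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-on-nosql").getOrCreate()

# Load semi-structured data (hypothetical file and schema)
events = spark.read.json("events.json")
events.createOrReplaceTempView("events")

# Plain SQL over data that never touched a relational database
spark.sql("""
    SELECT user_id, COUNT(*) AS clicks
    FROM events
    WHERE event_type = 'click'
    GROUP BY user_id
""").show()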

Posted December 01, 2016

There are three big challenges facing today's DBAs—a shift to an application-centric focus, the need to support multiple database platforms, and expanding responsibilities for managing database performance in the cloud as well as on premises.

Posted December 01, 2016

Yellowfin, a business intelligence (BI) and analytics software vendor, is launching a virtualized data preparation module for analytics, fully integrating it into the metadata layer of its BI platform. Yellowfin's Data Preparation Module will enable organizations to model, profile, clean, shape, enrich, secure, and publish all data for reporting and analytics in a single BI environment.

Posted December 01, 2016

Dataguise - through its DgSecure platform - is now supporting sensitive data discovery on Amazon Redshift and Amazon RDS, as well as Amazon Simple Storage Service (S3). The platform will now scan for sensitive information stored on Amazon Redshift, RDS, and S3 and provide ongoing monitoring of sensitive data in S3 throughout its lifecycle.

Posted December 01, 2016

What's ahead for 2017 in terms of big data and IoT? IT executives reflect on the impact that Spark, blockchain, data lakes, cognitive computing, AI and machine learning, and other cutting-edge approaches may have on data management and analytics over the year ahead.

Posted November 30, 2016

SUSE is acquiring OpenStack IaaS and Cloud Foundry PaaS talent and technology assets from HPE. The agreement aims to accelerate SUSE's entry into the growing Cloud Foundry Platform-as-a-Service (PaaS) market.

Posted November 30, 2016

Splice Machine is releasing an updated version of its signature platform for intelligent applications, strengthening its ability to run enterprise-scale transactional and analytical workloads. Version 2.5 of the Splice Machine platform introduces new capabilities - including columnar external tables that enable hybrid columnar and row-based querying, in-memory caching via pinning, and statistics via sketching - and offers cost-optimized storage for AWS users.

Posted November 30, 2016

At HPE Discover London, Hewlett Packard Enterprise (HPE) introduced new solutions designed to help organizations deploy Internet of Things (IoT) devices in wide area, enterprise and industrial environments.

Posted November 30, 2016

VoltDB has partnered with Confluent and completed development and certification of its Sink Connector for Confluent Open Source, based on Apache Kafka, utilizing Kafka's Connect API.

Posted November 30, 2016

Syncsort, an IT solutions provider backed by Clearlake Capital Group, L.P., has announced that it has signed a definitive agreement to acquire Trillium Software, a provider of data quality solutions.

Posted November 30, 2016

Databricks, the company founded by the creators of the Apache Spark project, has introduced a Health Insurance Portability and Accountability Act (HIPAA)-compliant, cloud-based Apache Spark platform. The announcement was made at the 2016 Amazon Web Services (AWS) re:Invent conference.

Posted November 30, 2016

Tableau Online, a fully managed SaaS cloud analytics platform, allows organizations to connect, blend and analyze live or extracted data from on-premises and cloud databases such as Amazon Redshift and Teradata, applications such as Salesforce and Google Analytics, and data in spreadsheets stored in computers or online.

Posted November 29, 2016

CA Technologies is incorporating predictive analytics capabilities for monitoring data that touches the mainframe. CA Mainframe Operations Intelligence is a new algorithm-based solution suite that identifies and predicts system issues before they impact performance. Machine learning is applied to help prevent downtime and provide the intelligence that reduces the effort and skill required to run data centers.

Posted November 28, 2016
