We are entering a new era in computing, one that departs from the current model of computing as process automation and instead provides a collaborative platform for discovery, helping us gain actionable insights.

To do so, the research community is adding advanced analytics tools to systems middleware in order to offer predictive models. Thanks to the advent of Big Data, these models can be improved, or “trained”, by exposing them to large data sets that were previously unavailable. The general idea is that instead of instructing a computer what to do, we simply throw data at the problem and tell the computer to figure it out by itself. For this purpose, the middleware takes on functions associated with the brain, such as inference, prediction, correlation, and abstraction, giving systems the ability to perform these tasks on their own. Hence the use of the word “cognitive” to describe this new kind of computing.
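As a minimal illustration of this “throw data at the problem” idea, the sketch below (plain Python; the function and variable names are hypothetical) learns a simple linear rule from example pairs by gradient descent. The rule itself is never coded into the program; it is recovered from the data alone:

```python
# Illustrative sketch: instead of hand-coding the rule y = 2x + 1,
# we let a tiny model "figure it out" from example data alone.

def train(examples, steps=5000, lr=0.01):
    """Fit y = w*x + b to (x, y) pairs by gradient descent."""
    w, b = 0.0, 0.0
    n = len(examples)
    for _ in range(steps):
        # Gradients of the mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in examples) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in examples) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated by the hidden rule y = 2x + 1; the program never sees the rule.
data = [(x, 2 * x + 1) for x in range(10)]
w, b = train(data)
print(round(w, 2), round(b, 2))  # learned parameters approach 2 and 1
```

The same principle, scaled up to far larger models and data sets, is what the trained predictive models mentioned above rely on.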

These reasoning capabilities, data complexity, and time-to-value expectations are driving the need for a new class of supercomputer systems. Continuous development of supercomputing systems is required to enable the convergence of advanced analytic algorithms and Big Data technologies, driving new insights based on the massive amounts of available data. We can identify four foundational building blocks that will help to organize the research agenda in this area:

Cognitive Computing building blocks

We will use the term “Cognitive Computing” (others use Smart Computing, Intelligent Computing, etc.) to label this new type of computing research. Whatever we call it, this change is really the integration of the best analytics knowledge with new Big Data technologies and the power of emerging computational systems to interpret massive amounts of data, of many different types, at unprecedented speed.

We’ve seen tremendous technological innovation in the big data middleware space over the past years: a great deal of innovation in the processing layer (e.g. the MapReduce programming model) and in the management layer (e.g. NoSQL databases). However, the current middleware software stack does not offer an intelligent layer that simplifies big data analytics, and the performance of the middleware stack in today’s Big Data systems needs to increase. Inevitably, this will lead to the creation of a new layer that offers learning tools while, at the same time, abstracting the lower layers to simplify the big data software stack. We refer to this new layer as the Cognitive Layer. It will help to automate predictive analysis, freeing users and developers from wasting their time on tedious tasks related to data management and data processing.
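To make the processing-layer example concrete, here is a hedged sketch of the MapReduce programming model applied to word counting. It runs in a single process rather than on a cluster, and the function names (`map_phase`, `shuffle`, `reduce_phase`) are illustrative, not part of any real framework’s API:

```python
# Minimal in-process sketch of the MapReduce programming model (word count).
from collections import defaultdict

def map_phase(document):
    # Emit one intermediate (key, value) pair per word.
    for word in document.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Group intermediate values by key, as the framework would between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Combine all values emitted for the same key.
    return key, sum(values)

docs = ["Big Data", "big data analytics"]
pairs = [p for d in docs for p in map_phase(d)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'big': 2, 'data': 2, 'analytics': 1}
```

In a real deployment the map and reduce functions stay this simple; the framework supplies the distribution, shuffling, and fault tolerance, which is precisely the kind of lower-layer complexity the Cognitive Layer would abstract further.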

Middleware Software Stack


April 22nd, 2015