Big Data is so last year. Machine learning may become the greatest topic in computer science of this century.


http://saffrontech.com/2013/09/18/big-data-requires-cognitive-computing-for-model-free-machine-learning/#!


Big Data Requires Cognitive Computing for Model-Free Machine Learning

“Big Data is so last year.” Well then, what is there left to say? Gus Hunt’s point is that big data methods may help us understand what is going on in the world, but new approaches are needed to extract real value from Big Data.

On the subject of new approaches, I will begin where Gus Hunt concludes his talk: “We’ve got this third wave of computing that has emerged, which are cognitive machines…. This is a world which is going to explode upon us, and cognitive machines are going to do everything from medicine, to financial trading, to helping us with our intelligence analysis across the board.” So what is Cognitive Computing?

Cognitive Computing refers to a brain-like, more human form of machine thinking. The ability to learn must be at its core, just as learning is at the core of human cognition.

In the end, AI reasoning and data modeling have little or nothing to do with the natural intelligence of human cognition. Humans learn about the big world automatically, continuously, and in real time. To address Big Data, machines must be able to do the same.

Each Big Data dimension (volume, velocity, variety, and volatility) creates a problem for traditional analytic methods, but the last two best illustrate how Big Data challenges older approaches such as traditional statistical modeling.

Variety

As we gather more and more variables about more and more people from more and more sources, each of us begins to appear as a “segment of one.” As consumers, we are unique. As patients, we are unique. A single scoring model or stereotyped market segment cannot possibly account for the richness and detail available about each one of us. You are the one in a model of one, not one in a model of many. Through collaborative inference from others like you, you define the center of your own unique cluster, as sketched below.
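
To make the “segment of one” idea concrete, here is a minimal sketch of collaborative inference in which each person is scored from their own unique neighborhood rather than from a single global model. All names and data here (find_neighbors, profiles, outcomes, k) are illustrative assumptions, not Saffron’s actual method:

    # Each person sits at the center of their own cluster; predictions are
    # inferred from the most similar people around them instead of from one
    # global scoring model.
    import numpy as np

    def find_neighbors(profiles, person, k=5):
        # Return indices of the k profiles most similar to `person`.
        distances = np.linalg.norm(profiles - person, axis=1)
        return np.argsort(distances)[:k]

    def score_segment_of_one(profiles, outcomes, person, k=5):
        # Predict an outcome for `person` from their own unique neighborhood.
        neighbors = find_neighbors(profiles, person, k)
        return outcomes[neighbors].mean()

    # Usage: each row is one person's attribute vector; outcomes are known labels.
    profiles = np.random.rand(1000, 20)   # 1,000 people, 20 attributes each
    outcomes = np.random.rand(1000)       # e.g., response or risk scores
    me = np.random.rand(20)
    print(score_segment_of_one(profiles, outcomes, me))

The design point is that no stereotype segment is ever fit in advance: the “model” is assembled fresh around each individual at query time.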

Volatility

A volatile world is not stationary, so continuous autonomous learning methods are required. Instant learning (as opposed to gradual adaptation) is even better when conditions change abruptly. Otherwise, a data modeler must constantly maintain and rebuild models to avoid staleness and growing inaccuracy. Instead, there is an opportunity to leverage how our brains learn all the time, constantly assimilating new data into the knowledge gained from prior data. This is the natural way of learning: constantly gathering experience and then exploiting it for whatever comes our way.
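
As a contrast between periodic model rebuilds and continuous autonomous learning, here is a minimal sketch of an online learner that assimilates each new observation as it arrives, so there is no stale model to rebuild when the world shifts. The learning rate, feature count, and drift point are illustrative assumptions:

    import numpy as np

    class OnlineLinearModel:
        # A linear model updated one observation at a time (online SGD).
        def __init__(self, n_features, lr=0.01):
            self.w = np.zeros(n_features)
            self.lr = lr

        def predict(self, x):
            return self.w @ x

        def learn(self, x, y):
            # Assimilate the new (x, y) pair immediately; no batch refit.
            error = self.predict(x) - y
            self.w -= self.lr * error * x

    # Usage: a data stream whose underlying relationship drifts mid-stream.
    model = OnlineLinearModel(n_features=3)
    true_w = np.array([1.0, -2.0, 0.5])
    for t in range(10000):
        if t == 5000:
            true_w = np.array([-1.0, 0.5, 2.0])  # the world changes
        x = np.random.randn(3)
        model.learn(x, true_w @ x)
    print(model.w)  # tracks the new relationship without any rebuild

A batch modeler would have to notice the drift and schedule a rebuild; the online learner simply keeps tracking.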

This is a good article. Read the full piece at http://saffrontech.com/2013/09/18/big-data-requires-cognitive-computing-for-model-free-machine-learning/#!
