From Irving Wladawsky-Berger:
In the first week of October I participated in a Cognitive Systems Colloquium hosted by IBM at its Thomas J. Watson Research Center. IBM defines cognitive systems as “a category of technologies that uses natural language processing and machine learning to enable people and machines to interact more naturally to extend and magnify human expertise and cognition. These systems will learn and interact to provide expert assistance to scientists, engineers, lawyers, and other professionals in a fraction of the time it now takes.”
The need for such systems is a result of the explosive growth of data all around us. Not only are we now able to collect huge amounts of real-time data about people, places and things, but far greater amounts can be derived from the original data through feature extraction and contextual analysis. One of the key lessons from Watson, IBM’s question-answering system which in 2011 won the Jeopardy! Challenge against the two best human Jeopardy! players, was that the very process of analyzing data increases the amount of data by orders of magnitude.
This is challenging our ability to store and analyze all that data. The new generation of cognitive systems will require innovation breakthroughs at every layer of our IT systems, including technology components, system architectures, software platforms, programming environments and the interfaces between machines and humans.
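To make the data-expansion point above concrete, here is a minimal sketch, not drawn from the article: it derives simple word n-gram features tagged with their position (a crude stand-in for feature extraction and contextual analysis) from a small piece of text, and compares the size of the derived representation with the raw input. The sample text, feature choices and size accounting are all illustrative assumptions.

```python
# Illustrative sketch only: simple feature extraction can make the derived
# data much larger than the raw text it came from.
import sys
from itertools import islice

raw_text = "the quick brown fox jumps over the lazy dog " * 100
words = raw_text.split()

def ngrams(tokens, n):
    """Yield all contiguous n-grams from a token list."""
    return zip(*(islice(tokens, i, None) for i in range(n)))

# Derive features: unigrams, bigrams and trigrams, each tagged with its
# position in the document (a crude stand-in for "contextual analysis").
features = []
for n in (1, 2, 3):
    for pos, gram in enumerate(ngrams(words, n)):
        features.append((pos, n, " ".join(gram)))

raw_bytes = len(raw_text.encode("utf-8"))
derived_bytes = sum(sys.getsizeof(f) + len(f[2]) for f in features)

print(f"raw text:         {raw_bytes:,} bytes")
print(f"derived features: {len(features):,} items, ~{derived_bytes:,} bytes")
# Even this crude accounting shows the derived representation is roughly an
# order of magnitude larger than the raw input; richer analyses (parses,
# embeddings, entity links) expand it further.
```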
In the opening presentation, IBM Research director John Kelly summarized both the promise and the challenges of cognitive systems. Kelly has just published Smart Machines: IBM's Watson and the Era of Cognitive Computing, co-written with IBM writer and strategist Steve Hamm.
Data-driven cognitive systems are quite different from the programmable systems we’ve been using for over 60 years. Just about all computers in use today are based on the von Neumann architectural principles laid out in 1945 by mathematician John von Neumann. Any problem that can be expressed as a set of instructions can be codified in software and executed on such stored-program machines. This architecture has worked very well for many different kinds of scientific, business, government and consumer applications, but it is limited in its ability to deal with large amounts and varieties of unstructured information in real time.
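As a toy illustration of the stored-program idea described above, the sketch below (my own illustrative assumption, not anything from IBM or the post) keeps instructions and data in a single memory and runs a simple fetch-decode-execute loop over whatever program is stored there.

```python
# Minimal stored-program machine sketch: instructions and data share one
# memory, and a fetch-decode-execute loop runs the stored program.
memory = [
    ("LOAD", 6),    # load memory[6] into the accumulator
    ("ADD", 7),     # add memory[7] to the accumulator
    ("STORE", 8),   # store the accumulator into memory[8]
    ("PRINT", 8),   # print memory[8]
    ("HALT", 0),
    None,           # unused cell
    40,             # data at address 6
    2,              # data at address 7
    0,              # result goes to address 8
]

pc, acc = 0, 0                  # program counter and accumulator
while True:
    op, addr = memory[pc]       # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "PRINT":
        print(memory[addr])     # prints 42
    elif op == "HALT":
        break
```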
Our brains have evolved to do this quite well over millions of years. For example, Watson required 85,000 watts of power, compared to around 20 watts for the brains of the human players. But, while our brains are incredibly efficient, they can’t keep up on their own with the huge volumes of information now coming at us from all sides, or with the increasing complexity of so many human endeavors, including medical diagnoses, financial advice or business strategies.
So, just as we invented industrial machines to help us enhance our strength and speed, we now need to develop this new generation of machines to help us enhance our cognitive capabilities. In fact, the architectures of such cognitive systems have more in common with the structure of the human brain than with those of classic von Neumann machines....MORE
HT: vbounded, who notes:
"people using data and models they don't understand creates lots of opportunities"