Self-learning hardware and software
systems that use machine learning, natural language processing, pattern
recognition, human-computer interaction, and data mining technologies to mimic
the human brain are referred to as cognitive computing.
The term "cognitive computing" refers to the use
of advances in cognitive science to create new and complex artificial
intelligence systems.
Cognitive systems aren't designed to take the place of human
thinking, reasoning, problem-solving, or decision-making; rather, they're meant
to supplement or aid people.
Cognitive computing is also frequently used to describe a collection of strategies that promote the aims of affective computing, which entails narrowing the gap between computer technology and human emotions.
Real-time adaptive learning approaches, interactive cloud
services, interactive memories, and contextual understanding are some of these
methodologies.
Cognitive analytical tools are used to conduct quantitative assessments of structured statistical data and to aid in decision-making. These tools are often incorporated into other scientific and economic systems.
Complex event processing systems use sophisticated algorithms to analyze real-time event data for patterns and trends, present options, and make decisions.
These kinds of systems are widely used in algorithmic stock
trading and credit card fraud detection.
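As a rough illustration of the idea, rather than any particular vendor's engine, the sketch below flags a card as suspicious when it is charged in two different cities within a short sliding window; the window length, event format, and rule itself are invented for the example.

```python
from collections import deque
from datetime import datetime, timedelta

# Hypothetical rule: flag a card charged in two different cities within ten
# minutes -- a crude stand-in for the patterns a complex event processing
# engine watches for in a real-time stream.
WINDOW = timedelta(minutes=10)

def detect_suspicious(events):
    """events: iterable of (timestamp, card_id, city, amount) tuples."""
    recent = {}   # card_id -> deque of (timestamp, city)
    alerts = []
    for ts, card, city, amount in events:
        history = recent.setdefault(card, deque())
        # Drop transactions that fall outside the sliding window.
        while history and ts - history[0][0] > WINDOW:
            history.popleft()
        # Two different cities inside the window is treated as suspicious.
        if any(prev_city != city for _, prev_city in history):
            alerts.append((ts, card, city, amount))
        history.append((ts, city))
    return alerts

stream = [
    (datetime(2023, 1, 1, 12, 0), "card-1", "Boston", 40.00),
    (datetime(2023, 1, 1, 12, 4), "card-1", "Denver", 900.00),
]
print(detect_suspicious(stream))  # -> flags the Denver charge
```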
Image recognition systems now make face recognition and other complex recognition tasks possible.
Machine learning algorithms build models from data sets and
improve as new information is added.
Neural networks, Bayesian classifiers, and support vector
machines may all be used in machine learning.
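A minimal sketch of this incremental flavor of machine learning, assuming the scikit-learn library and using synthetic data: an initial model is fit, then refined as a new batch of labeled examples arrives.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Initial training set: two noisy clusters labeled 0 and 1 (synthetic data).
X_initial = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y_initial = np.array([0] * 50 + [1] * 50)

model = GaussianNB()
model.partial_fit(X_initial, y_initial, classes=[0, 1])

# Later, a fresh batch of labeled data arrives; the model is refined in place.
X_new = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y_new = np.array([0] * 20 + [1] * 20)
model.partial_fit(X_new, y_new)

print(model.predict([[0.2, -0.1], [2.8, 3.1]]))  # -> [0 1]
```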
Natural language processing entails the use of software to
extract meaning from enormous amounts of data generated by human conversation.
Watson from IBM and Siri from Apple are two examples.
Natural language comprehension is perhaps cognitive
computing's Holy Grail or "killer app," and many people associate
natural language processing with cognitive computing.
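The toy sketch below gestures at the basic task of pulling meaning from text by scoring sentiment against small hand-made word lists; real systems such as Watson or Siri rely on far richer statistical and neural models, and the lexicons and sentences here are invented for illustration.

```python
# Tiny lexicon-based sentiment scorer: count positive and negative words.
POSITIVE = {"great", "helpful", "love", "excellent", "good"}
NEGATIVE = {"bad", "slow", "hate", "terrible", "poor"}

def sentiment_score(text):
    words = [w.strip(".,!?").lower() for w in text.split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("The assistant was helpful and the answers were excellent."))  # 2
print(sentiment_score("The service was slow and the results were poor."))            # -2
```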
Heuristic programming and expert systems are two of the
oldest branches of so-called cognitive computing.
Since the 1980s, there have been four reasonably
"full" cognitive computing architectures: Cyc, Soar, Society of Mind,
and Neurocognitive Networks.
Speech recognition, sentiment analysis, face identification,
risk assessment, fraud detection, and behavioral suggestions are some of the
applications of cognitive computing technology.
When used together, these applications are referred to as "cognitive analytics" systems.
These systems are in development or already in use in aerospace and defense, agriculture, travel and transportation, banking, health care and the life sciences, entertainment and media, natural resource development, utilities, real estate, retail, manufacturing and sales, marketing, customer service, hospitality, and leisure.
Netflix's movie rental suggestion algorithm is an early
example of predictive cognitive computing.
Computer vision algorithms are being used by General
Electric to detect tired or distracted drivers.
Customers of Domino's Pizza can place orders online by
speaking with a virtual assistant named Dom.
Elements of Google Now, a predictive search feature that
debuted in Google applications in 2012, assist users in predicting road
conditions and anticipated arrival times, locating hotels and restaurants, and
remembering anniversaries and parking spots.
The term "cognitive computing" appears frequently in IBM marketing materials. Cognitive computing, according to the company, is a subset of "augmented intelligence," a term it prefers to "artificial intelligence."
The Watson machine from IBM is frequently referred to as a
"cognitive computer" since it deviates from the traditional von
Neumann design and instead draws influence from neural networks.
Neuroscientists are researching the inner workings of the human brain, searching for connections between neuronal assemblies and mental processes, and generating new ideas about the mind.
Hebbian theory is an example of a neuroscientific theory that underpins machine learning implementations in cognitive computing.
The Hebbian theory is a proposed explanation for neural
adaptation during the learning process.
Donald Hebb initially proposed the hypothesis in his 1949
book The Organization of Behavior.
Learning, according to Hebb, is a process in which the
causal induction of recurrent or persistent neuronal firing or activity causes
neural traces to become stable.
"Any two cells or systems of cells that are
consistently active at the same time will likely to become'associated,' such
that activity in one favors activity in the other," Hebb added (Hebb 1949,
70).
"Cells that fire together, wire together," is how
the idea is frequently summarized.
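In machine learning terms, the rule is often rendered as a weight update proportional to the product of presynaptic and postsynaptic activity. The sketch below, with an arbitrary learning rate and activity pattern, shows how repeated co-activation strengthens a connection weight while weights to silent units stay unchanged.

```python
# Minimal Hebbian weight update: when a presynaptic and a postsynaptic unit
# are active together, the weight between them grows. The learning rate and
# activity patterns are arbitrary choices for illustration.
import numpy as np

eta = 0.1                          # learning rate (assumed)
pre = np.array([1.0, 0.0, 1.0])    # presynaptic activity
post = np.array([1.0, 1.0])        # postsynaptic activity

w = np.zeros((post.size, pre.size))   # connection weights, post x pre

for _ in range(5):                    # repeated co-activation
    w += eta * np.outer(post, pre)    # "fire together, wire together"

print(w)
# Weights from the unit that never fired (pre[1]) stay at zero; the others grow.
```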
According to this hypothesis, the connection of neuronal
cells and tissues generates neurologically defined "engrams" that
explain how memories are preserved in the brain as biophysical or biochemical
changes.
The actual location of engrams, as well as the processes by which they are formed, remains unknown.
IBM machines are said to learn by aggregating information into a computational convolution or neural network architecture made up of weights stored in a parallel memory system.
Intel introduced Loihi, a cognitive chip that replicates the
functions of neurons and synapses, in 2017.
Loihi is touted to be 1,000 times more energy efficient than existing neurosynaptic devices, with 128 clusters of 1,024 simulated neurons on each chip, for a total of 131,072 simulated neurons.
Instead of relying on simulated neural networks and parallel
processing with the overarching goal of developing artificial cognition, Loihi
uses purpose-built neural pathways imprinted in silicon.
These neuromorphic processors are likely to play a significant
role in future portable and wire-free electronics, as well as automobiles.
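A generic leaky integrate-and-fire neuron conveys the kind of spiking behavior such chips implement directly in silicon; this is not Loihi's actual neuron model, and all constants below are arbitrary illustrative values.

```python
# Leaky integrate-and-fire neuron: membrane potential leaks over time,
# integrates input current, and emits a spike when it crosses a threshold.
def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    potential, spikes = 0.0, []
    for step, current in enumerate(input_current):
        potential = potential * leak + current   # leak, then integrate input
        if potential >= threshold:               # fire when threshold reached
            spikes.append(step)
            potential = reset                    # reset after the spike
    return spikes

print(simulate_lif([0.3] * 20))  # -> time steps at which the neuron spikes
```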
Roger Schank, a cognitive scientist and artificial
intelligence pioneer, is a vocal opponent of cognitive computing technology.
"Watson isn't thinking. You can only reason if you have objectives, plans, and
strategies to achieve them, as well as an understanding of other people's ideas
and a knowledge of prior events to draw on.
"Having a point of view is also beneficial," he
writes.
"How does Watson feel about ISIS, for example?" Is
this a stupid question? ISIS is a topic on which actual thinking creatures have
an opinion" (Schank 2017).
~ Jai Krishna Ponnappan
See also:
Computational Neuroscience; General and Narrow AI; Human Brain Project; SyNAPSE.
Further Reading
Hebb, Donald O. 1949. The Organization of Behavior. New York: Wiley.
Kelly, John, and Steve Hamm. 2013. Smart Machines: IBM’s Watson and the Era of Cognitive Computing. New York: Columbia University Press.
Modha, Dharmendra S., Rajagopal Ananthanarayanan, Steven K. Esser, Anthony Ndirango, Anthony J. Sherbondy, and Raghavendra Singh. 2011. “Cognitive Computing.” Communications of the ACM 54, no. 8 (August): 62–71.
Schank, Roger. 2017. “Cognitive Computing Is Not Cognitive at All.” FinTech Futures, May 25. https://www.bankingtech.com/2017/05/cognitive-computing-is-not-cognitive-at-all
Vernon, David, Giorgio Metta, and Giulio Sandini. 2007. “A Survey of Artificial Cognitive Systems: Implications for the Autonomous Development of Mental Capabilities in Computational Agents.” IEEE Transactions on Evolutionary Computation 11, no. 2: 151–80.