

CPC and the Grandmother Neuron

A lot of recent work has advanced the learning of increasingly context-sensitive distributed representations (i.e., so-called ‘embeddings’).  DeepMind’s paper on “Contrastive Predictive Coding” (CPC) is particularly interesting and advances on a number of fronts.  For example, in wav2vec, Facebook AI Research (FAIR) uses CPC to obtain apparently superior acoustic modeling results to DeepSpeech’s connectionist temporal classification (CTC) approach.  The following image from the CPC paper is particularly striking, harkening back to the early notion of a Grandmother Cell.

[Figure: grandmother cells resulting from CPC]
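The heart of CPC is the InfoNCE objective: a context representation predicts future latent codes, and the prediction is trained to score its true future higher than "negative" latents drawn from other positions in the batch.  A minimal NumPy sketch of that contrastive scoring follows; the dimensions, the toy latents, and the `info_nce` helper are all hypothetical, for illustration only, not DeepMind's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def info_nce(predictions, targets):
    """InfoNCE loss: each predicted future latent should score highest
    against its own target, with the rest of the batch as negatives."""
    # similarity logits: entry [i, j] = prediction_i . target_j
    logits = predictions @ targets.T
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # the "positive" pair for row i is the diagonal entry [i, i]
    return -np.mean(np.diag(log_probs))

# toy setup: batch of 8 time steps, 16-dimensional latents (made-up sizes)
z_future = rng.normal(size=(8, 16))                  # "true" future latents
z_pred = z_future + 0.1 * rng.normal(size=(8, 16))   # noisy predictions
loss_matched = info_nce(z_pred, z_future)
loss_random = info_nce(rng.normal(size=(8, 16)), z_future)
print(loss_matched, loss_random)
```

Because the matched predictions score far higher on the diagonal than against the negatives, their loss comes out much lower than that of random predictions, which is the signal CPC's encoder is trained on.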

Confessions of a production rule vendor (part 2)

Going on 5 years ago, I wrote part 1.  Now, finally, it’s time for the rest of the story.


Natural Intelligence

Deep natural language understanding (NLU) is different from deep learning, as is deep reasoning.  Deep learning facilitates deep NLP and will facilitate deeper reasoning, but it’s deep NLP for knowledge acquisition and question answering that seems most critical for general AI.  If that’s the case, we might call such general AI, “natural intelligence”.

Deep learning on its own delivers only the most shallow reasoning and embarrasses itself due to its lack of “common sense” (or any knowledge at all, for that matter!).  DARPA, the Allen Institute, and deep learning experts have come to their senses about the limits of deep learning with regard to general AI.

General artificial intelligence requires all of it: deep natural language understanding[1], deep learning, and deep reasoning.  The deep aspects are critical but no more so than knowledge (including “common sense”).[2]