Commercial Intelligence

Posts Tagged ‘NLP’

Are vitamins subject to sales tax in California?

What is the part of speech of “subject” in the sentence: Are vitamins subject to sales tax in California? Related questions might include: Does California subject vitamins to sales tax? Does California sales tax apply to vitamins? Does California tax vitamins? “Vitamins” is the direct object of the verb in each of these sentences, so, [...]

Common sense about deep learning

I regularly build deep learning models for natural language processing, and today I tried one that has been the leader on the Stanford Question Answering Dataset (SQuAD). It is an impressive NLP platform built using PyTorch. But it’s still missing the big picture (i.e., it doesn’t “know” much). Generally, NLP systems that [...]

‘believed by many’

A Linguist user recently had a question about part of a sentence that boiled down to something like the following: “It is believed by many.” The question was whether “many” was an adjective, cardinality, or noun in this sentence. It’s a reasonable question!

Parsing Winograd Challenges

The Winograd Challenge is an alternative to the Turing Test for assessing artificial intelligence. The essence of the test involves resolving pronouns. To date, systems have not fared well on the test for several reasons. Three come to mind: The natural language processing involved in the word problems is beyond the state [...]
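To make the pronoun-resolution task concrete, here is a minimal sketch of the structure of a Winograd schema, using the classic councilmen/demonstrators example due to Terry Winograd (not an example from this post). The data layout and the `answers_differ` helper are illustrative assumptions, not part of any official test harness:

```python
# A Winograd schema: two nearly identical sentences whose pronoun refers
# to a different antecedent depending on a single "special" word.
# Classic example due to Terry Winograd; representation is illustrative.
schema = {
    "candidates": ["the city councilmen", "the demonstrators"],
    "variants": [
        {
            "sentence": "The city councilmen refused the demonstrators "
                        "a permit because they feared violence.",
            "special_word": "feared",
            "answer": "the city councilmen",
        },
        {
            "sentence": "The city councilmen refused the demonstrators "
                        "a permit because they advocated violence.",
            "special_word": "advocated",
            "answer": "the demonstrators",
        },
    ],
}

def answers_differ(schema):
    """True when swapping the special word flips the pronoun's referent --
    the property that makes the pair a valid Winograd schema."""
    answers = {v["answer"] for v in schema["variants"]}
    return len(answers) == len(schema["variants"])

print(answers_differ(schema))  # a valid schema: the two answers differ
```

What makes the test hard for statistical systems is that both variants are nearly identical strings; only world knowledge (who fears vs. who advocates violence) distinguishes the referents.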

TA/NLP: It’s a jungle out there!

Text analytics and natural language processing have made tremendous advances in the last few years. Unfortunately, there is a lot more to understanding natural language than TA/NLP. I was reading a paper today about NLP pipelines for question answering that used machine learning to find what tools are good at what tasks and to configure [...]

Combinatorial ambiguity? No problem!

Working on translating some legal documentation (sales and use tax laws and regulations) into compliance logic, we came across the following sentence (and many more that are even worse): Any transfer of title or possession, exchange, or barter, conditional or otherwise, in any manner or by any means whatsoever, of tangible personal property for a [...]
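To put a number on “combinatorial”: a standard back-of-the-envelope bound (not an analysis of this particular statute) is that the distinct ways of grouping n ambiguous attachment points — modifiers, conjuncts, prepositional phrases — grow like the Catalan numbers. A minimal sketch:

```python
from math import comb

def catalan(n):
    """Catalan number C(n): the number of distinct binary bracketings
    of a sequence of n + 1 items -- a standard intuition for how parse
    ambiguity explodes with sentence length."""
    return comb(2 * n, n) // (n + 1)

# Each added modifier or conjunct multiplies the possible groupings:
for n in range(2, 9):
    print(n, catalan(n))  # 2->2, 3->5, ..., 8->1430
```

Eight ambiguous attachment points already allow 1,430 groupings, which is why a sentence piling up “or”s and “in any manner” phrases cannot be disambiguated by enumeration alone.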

Simple, Fast, Effective, Active Learning

Recently, we “read” ten thousand recipes or so from a cooking web site. The purpose of doing so was to produce a formal representation of those recipes for use in temporal reasoning by a robot. Our task was to produce an ontology by reading the recipes, subject to conflicting goals. On the one hand, the ontology [...]

It’s hard to reckon nice English

The title is in tribute to Raj Reddy’s classic talk about how it’s hard to wreck a nice beach. I came across interesting work on higher order and semantic dependency parsing today: Turning on the Turbo: Fast Third-Order Non-Projective Turbo Parsers. Priberam: A turbo semantic parser with second order features So I gave the software [...]

Smart Machines And What They Can Still Learn From People – Gary Marcus

This is a must-watch video from the Allen Institute for AI for anyone seriously interested in artificial intelligence. It’s 70 minutes long, but worth it. Some of the highlights from my perspective are: 27:27, where the key reason that deep learning approaches fail at understanding language is discussed; 31:30, where the inability of inductive approaches [...]

Deep Learning, Big Data and Common Sense

Thanks to John Sowa’s comment on LinkedIn for this link which, although slightly dated, contains the following: In August, I had the chance to speak with Peter Norvig, Director of Google Research, and asked him if he thought that techniques like deep learning could ever solve complicated tasks that are more characteristic of human intelligence, [...]